Title:  Senior Data Engineer


CommScope is pushing the boundaries of communications technology. For more than 40 years, we have been a leader in innovation for 5G, private networks, and Gigabit speeds everywhere, always anticipating what's next. Developments such as the Internet of Things, seamless connectivity, the Cloud, and 5G introduce new requirements and demand creative thinking. With our unmatched expertise in copper, fiber, and wireless infrastructure, our global clients rely on us to outperform today and be ready for the needs of tomorrow.


Ruckus is a Business Unit within CommScope focused on delivering cutting-edge solutions to build a smarter, simpler, more connected world. At the heart of global connectivity are the engineers who write innovative software for our award-winning routing and switching products, bringing information quickly and reliably to where it is needed. We are a pioneer in the wireless infrastructure market, enabling carriers and enterprises to stay ahead of the exploding demand for high-bandwidth applications and services. Our Ruckus Smart Wi-Fi, LTE, and Switching technology redefines what's possible in wireless network performance with flexibility, reliability, and affordability.



The Team

You will work with a dynamic, focused team developing state-of-the-art applications for Analytics and Artificial Intelligence (AI) in networking. As our Senior Data Engineer, you will implement our core software components and contribute to the design of scalable cloud software architecture. This is an exciting opportunity to join our talented team and be part of the next technology trend: Analytics & AI in Networking.



You Will Excite Us If You Have

  • Strong working knowledge of SaaS application development and Agile software development methodologies.
  • Good working knowledge of services from any public cloud offering. Experience with Google Cloud Platform would be a plus.
  • Good analytical skills working with both structured and unstructured datasets.
  • Solid experience building big data pipelines and ETL architectures.
  • Working knowledge of stream-processing pipelines for big data would be a plus.
  • At least 3 years of experience in a Data Engineering role, with good knowledge of and working experience with:

           - Relational databases (e.g., PostgreSQL) and non-relational databases, such as graph and time-series databases.

           - Streaming and data pipeline tools, e.g., Kafka, Spark Streaming.

           - Big data processing technologies, e.g., Spark, Hadoop, Ignite, Parquet.

           - Functional programming languages, e.g., Scala.

           - Virtualization and container environments such as Docker and Kubernetes.

           - Compiled and scripting languages, e.g., Golang, Ruby, Node.js.

Job Segment: Database, Engineer, Network, Telecom, Telecommunications, Technology, Engineering