AWS Big Data Architect (with Hadoop) III
6/30/2020
Mid-Atlantic
Philadelphia, PA
THE INTEPROS DIFFERENCE

IntePros is a certified woman-owned, results-oriented recruiting solutions and staffing company. We are the representative of choice for top professionals because we understand what motivates great people. We take an active role in your career, and our concerns are long-term. We have an extensive support system and are committed to making the right match between you and a company. We are proud of our retention rate: over 90% of our consultants choose to work with IntePros again.

IntePros is founded on the core values of accountability, family, passion, trust and value. IntePros offers all consultants comprehensive medical, dental and vision programs. We also offer direct deposit and a $1,500/year education and professional certification fund.

IntePros does not discriminate in employment on the basis of race, color, religion, sex, pregnancy, gender identity, national origin, sexual orientation, disability, age, veteran or military status, retaliation, or any other characteristic protected by law.

Only qualified individuals being considered will be contacted.

Data Engineer

Job Description

Join a multi-disciplinary team of devops engineers, software engineers, data analysts, and data scientists working together to improve the Comcast user experience.

Do you like big challenges and working within a highly motivated team environment? As a data software engineer on the Data Science and Engineering team within the Next Generation Access Network (NGAN) organization at Comcast, you will be part of a team that thrives on big challenges, results, quality, and agility. You will work closely with business stakeholders, data analysts, and data scientists within the organization, developing software solutions that help deliver insights into customer and network behavior and drive the business decisions shaping the future of Comcast.

Who does the Data Engineer work with?

You will collaborate with a diverse set of professionals, ranging from software engineers whose software integrates with analytics services, network architects and engineers tasked with evolving the network, service delivery engineers who provide support for our products, data analysts and data scientists distilling key insights from massive amounts of raw data, and operational stakeholders with all manner of information needs, to executives who rely on data for fact-based decision making.

What are some interesting problems you'll be working on?

Develop large-scale, cloud-based data pipelines for the collection and processing of device telemetry and network events, providing both a real-time and historical view into the operation of our products and services. Work on high-performance, real-time data stores and massive historical data sets using best-of-breed, industry-leading technology. Expose services over REST APIs. Work closely with various engineering teams to solve key optimization, insight, and access-network data challenges.
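
As a rough illustration of the pipeline work described above, here is a minimal sketch of an AWS Lambda handler (Python) triggered by a Kinesis stream that decodes device telemetry records and lands them in S3. The bucket name, environment variable, and record shape are illustrative assumptions, not the team's actual implementation.

    import base64
    import json
    import os

    import boto3

    s3 = boto3.client("s3")

    # Hypothetical destination bucket, supplied via Lambda environment configuration.
    RAW_BUCKET = os.environ.get("RAW_TELEMETRY_BUCKET", "example-raw-telemetry")


    def handler(event, context):
        """Decode Kinesis telemetry records and write them to S3 as newline-delimited JSON."""
        rows = []
        for record in event["Records"]:
            payload = base64.b64decode(record["kinesis"]["data"])
            rows.append(json.loads(payload))

        # One object per invocation, keyed by the Lambda request id.
        body = "\n".join(json.dumps(r) for r in rows)
        key = f"raw/{context.aws_request_id}.json"
        s3.put_object(Bucket=RAW_BUCKET, Key=key, Body=body.encode("utf-8"))

        return {"records_written": len(rows)}

In practice a function like this would sit behind a Kinesis event source mapping; the same pattern extends to batching, error handling, and partitioned S3 prefixes.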

Where can you make an impact?

The Data Science and Engineering team acquires, studies, simulates, and models data to enable it as a key driver and core functional component toward better understanding, predicting, and dynamically optimizing the access network to improve the overall user experience. Success in this role is best enabled by a broad mix of skills and interests, ranging from traditional distributed-systems software engineering prowess to the multidisciplinary field of data science.

Responsibilities:

• Developing large scale data pipelines exposing data sources within Comcast to our team of data analysts and data scientists.
• Developing REST APIs utilizing AWS Lambda and API Gateway.
• Developing Spark streaming and batch jobs to clean and transform data (see the sketch after this list).
• Writing build automation to deploy and manage cloud resources.
• Writing unit and integration tests.
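
For a sense of the Spark work mentioned above, here is a minimal PySpark batch job that reads raw JSON events, drops malformed rows, normalizes timestamps, and writes partitioned Parquet. The S3 paths and column names (device_id, ts) are assumptions for illustration only.

    from pyspark.sql import SparkSession, functions as F

    # Hypothetical S3 locations and column names, for illustration only.
    RAW_PATH = "s3://example-raw-telemetry/raw/"
    CLEAN_PATH = "s3://example-clean-telemetry/events/"


    def main():
        spark = SparkSession.builder.appName("clean-telemetry").getOrCreate()

        # Read raw newline-delimited JSON, drop malformed rows, and normalize the timestamp.
        events = (
            spark.read.json(RAW_PATH)
            .filter(F.col("device_id").isNotNull() & F.col("ts").isNotNull())
            .withColumn("event_time", F.from_unixtime(F.col("ts")).cast("timestamp"))
            .withColumn("event_date", F.to_date(F.col("event_time")))
            .dropDuplicates(["device_id", "ts"])
        )

        # Write partitioned Parquet for downstream analysts and data scientists.
        events.write.mode("overwrite").partitionBy("event_date").parquet(CLEAN_PATH)

        spark.stop()


    if __name__ == "__main__":
        main()

The same job could run as an EMR step or a Databricks task; a streaming variant would swap spark.read for spark.readStream against a Kinesis or Kafka source.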

Some of the specific technologies we use:

• Programming Languages: Python, Scala, Golang, Node.js
• Build Environment: GitHub Enterprise, Concourse CI, Jira, Serverless, SAM
• Cloud Computing: AWS Lambda, EC2, ECS
• Spark: AWS EMR, Databricks
• Stream Data Platforms: Kinesis, Kafka
• Databases: S3, MySQL, Oracle, MongoDB, DynamoDB
• Caching Frameworks: ElastiCache/Redis

Requirements:

• BS/MS degree in Computer Science, Mathematics, or other relevant science and engineering discipline.
• 4+ years working as a software engineer.
• 2+ years working within an enterprise data lake/warehouse environment or big data architecture.
• Excellent programming skills with experience in at least one of Python, Scala, Java, Node.js.
• Great communication skills.
• Proficiency in testing frameworks and writing unit/integration tests.
• Proficiency in Unix-based operating systems and bash scripts.

Preferred Additional Skills:

• Experience working with Spark.
• Experience with AWS.
• Experience with monitoring and visualization tools such as Grafana, Prometheus, Datadog, and CloudWatch.
• Experience with NoSQL databases such as DynamoDB, MongoDB, Redis, Cassandra, or HBase.
