
Data Engineer

Capital One
McLean, VA 22101
7900 Westpark Drive (12131), McLean, Virginia, United States of America

At Capital One, we're building a leading information-based technology company. Still founder-led by Chairman and Chief Executive Officer Richard Fairbank, Capital One is on a mission to help our customers succeed by bringing ingenuity, simplicity, and humanity to banking. We measure our efforts by the success our customers enjoy and the advocacy they exhibit. We are succeeding because they are succeeding.

Guided by our shared values, we thrive in an environment where collaboration and openness are valued. We believe that innovation is powered by perspective and that teamwork and respect for each other lead to superior results. We elevate each other and obsess about doing the right thing. Our associates serve with humility and a deep respect for their responsibility in helping our customers achieve their goals and realize their dreams. Together, we are on a quest to change banking for good.

Data Engineer

Investing in the right information security capabilities is essential to protecting Capital One's customers and employees. The CyberML team collects and analyzes vast quantities of data to help detect malware, prevent fraud, and protect customers.

As a lead Data Engineer in CyberML, you will contribute to building data-driven and machine learning solutions that tackle some of the most interesting use cases in the cybersecurity industry. You will engage with security analysts to understand their problems and challenges, work with Cyber's senior leadership to shape the vision of cyber engineering and innovation, and lead the development team to deliver robust, cutting-edge products.

Who You Are:

  • You are interested in working on challenging problems involving scalability and performance

  • You can effectively collaborate with other teams to work on high-profile initiatives

  • You enjoy learning new technologies and picking up new skills

What The Role Is:

  • Collaborating as part of a cross-functional Agile team to create and enhance software that enables state-of-the-art, next-generation Big Data and Fast Data applications

  • Building efficient and scalable storage for structured and unstructured data

  • Developing and deploying distributed computing Big Data applications using open source frameworks like Apache Spark, Apex, Flink, NiFi, Storm, and Kafka on the AWS Cloud

  • Building and running large-scale NoSQL databases like Elasticsearch and Cassandra

  • Utilizing programming languages like Java, Scala, and Python

  • Designing and building applications for the cloud (AWS, Azure, GCP, DO)

  • Leveraging DevOps techniques and practices like Continuous Integration, Continuous Deployment, Test Automation, Build Automation, and Test Driven Development to enable the rapid delivery of working code utilizing tools like Jenkins, Maven, Nexus, Chef, Terraform, Ruby, Git, and Docker

  • Performing unit tests and conducting reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance

Basic Qualifications:

  • Bachelor's Degree

  • At least 3 years of professional programming experience in Java, Scala, Python, C++, or Golang

  • At least 3 years of professional experience working on data streaming or data warehousing applications

  • At least 2 years of experience working with Linux-based Operating Systems

  • At least 2 years of experience working with scripting languages

  • At least 2 years of experience working within cloud environments

Preferred Qualifications:

  • Bachelor's Degree in Computer Science or related technical discipline

  • Experience with Linux Administration (e.g., Red Hat, CentOS, AWS Linux)

  • Experience working with Elasticsearch and Lucene-based search

  • Experience working with Snowflake data warehouse

  • Experience working with container runtimes (e.g., Docker, rkt, cri-o)

  • Experience working with container frameworks (e.g., Kubernetes, Mesosphere)

  • Experience with streaming analytics, complex event processing, and probabilistic data structures

  • Experience with columnar data stores and MPP

  • AWS Solution Architect Associate or Professional

At this time, Capital One will not sponsor a new applicant for employment authorization for this position.

Posted: 2020-02-10 Expires: 2020-03-11
