Charles Schwab Corporation

Senior Big Data Engineer

Job Locations US-TX-Westlake | US-CO-Lone Tree
Requisition ID: 2020-63908
Posted Date: 6/23/2020 1:35 PM
Category: Engineering
Position Type: Full time

Your Opportunity

At Schwab, the Global Data Technology (GDT) organization leads the strategy, implementation and management of enterprise data technology. GDT enables the management of data as an asset and the delivery of data along the value chain across Schwab. The team helps Marketing, Finance, Risk and various P&Ls make fact-based decisions by integrating and analyzing data, and leverages data operationally for competitive advantage. It delivers innovative client experience capabilities and rich business insight through robust, enterprise data-driven capabilities.

The Dev Engineering team within GDT focuses on building new frameworks and enhancing existing ones to improve overall project delivery efficiency and maintain coding standards. We are looking for a Senior Software Development Engineer to realize this vision for the data platform, help evolve our service practices, and build frameworks that support the continually evolving needs of our user base. The ideal candidate has a passion for data technologies and the mindset to identify and implement innovative ideas that mature our service operations.

What you are good at

  • Diagnosing and fixing highly complex technical issues independently
  • Communicating individual and project-level development status, issues, risks, and concerns to technical leadership and management
  • Creating documentation and training related to the technology stacks and standards within your assigned team
  • Coaching and mentoring junior engineers in engineering techniques, processes, and new technologies; enabling others to succeed
  • Collaborating with business and technology partners and offshore development teams
  • Working with product owners and technical directors to lead technical discussions and resolve technical issues
  • Designing, developing and delivering data solutions on Teradata, Big Data and cloud data platforms
  • Developing and maintaining code for data ingestion and curation using Informatica, Talend, Sqoop, Hive, etc.
  • Managing day-to-day development activities for new data solutions and troubleshooting existing implementations
  • Applying best practices of data integration for data quality and automation
  • Reviewing data models and data architecture for Hadoop and HBase environments
  • Documenting design solutions and maintaining supporting documentation
  • Working with business analysts to understand business requirements and use cases

What you have

  • Bachelor's degree in Computer Science or related discipline
  • Experience with a structured application development methodology using an industry-standard Software Development Lifecycle, in particular Agile methodologies, is required
  • 6+ years of overall experience in IT, with a strong understanding of best practices for designing and building ETL code
  • 5+ years of experience with ETL tools, with specific expertise implementing Informatica/Talend in an enterprise environment
  • Strong experience building an enterprise data lake using Talend, Sqoop, Hive, MongoDB, etc.
  • Experience designing, building and supporting data processing pipelines that transform data on Big Data, Teradata and cloud platforms (GCP, AWS)
  • Experience in real time data ingestion into Hadoop is required
  • Understanding of Hadoop file formats and compression is required
  • Understanding of best practices for building a data lake and analytical architecture on Hadoop is required
  • Familiarity with MapR distribution of Hadoop is preferred
  • Experience in, or deep understanding of, cloud-based data technologies (GCP/AWS) is preferred
  • Experience with change data capture (CDC) tools such as Attunity preferred
  • Scripting/programming with UNIX shell, Java, Python, Scala, etc. is preferred
  • Hands-on experience in Java object-oriented programming (at least 2 years)
  • Hands-on experience with Hadoop, MapReduce, Hive, Pig, Flume, Storm, Spark, Kafka and HBase (at least 3 years)
  • Experience with job schedulers such as ActiveBatch or Control-M preferred
  • Experience with test-driven development and SCM tools such as Git and Jenkins preferred
  • Good verbal and written communication skills
  • Strong interpersonal, analytical, problem-solving, influencing, prioritization, decision-making and conflict resolution skills
  • Ability to thrive in a flexible and fast-paced environment across multiple time zones and locations
  • Experience in the financial services industry is a plus


Why Schwab?

At Schwab, “Own Your Tomorrow” embodies everything we do! We are committed to helping our employees unleash their potential and achieve their dreams. Our employees get to play a central role in disrupting a multi-trillion-dollar industry, creating a better, more modern way to build and manage wealth. We’re a modern financial services firm that stands apart from the industry, where you can go as far as your ambition takes you.

Hear from employees: What’s it like to work at Schwab!

The benefits of working at Schwab: a package designed to empower your health, wealth, career and life.
Schwab is committed to building a diverse and inclusive workplace where everyone feels valued.

As an equal employment opportunity employer, our policy is to provide equal employment opportunities to all employees and applicants without regard to any status that is protected by law.

Schwab is also an affirmative action employer, focused on advancing women, minorities, veterans, and individuals with disabilities in the workplace.
We believe diversity and inclusion are part of our success as a company and our purpose of serving every client with passion and integrity.