Sr. AWS Data Engineer

Location: Charlotte, NC (Onsite) | Duration: Long Term | MOI: Video Interview

Job Description:
The Data Engineer will be responsible for the ongoing database engineering process, providing deep subject matter expertise to feature design and serving as an advocate for raising and resolving customer issues. We are seeking a forward-thinking professional who is comfortable using the latest technology offerings and whose main responsibility will be to build and maintain new capabilities that operationalize and automate data methodologies. The position is based in Charlotte, North Carolina, and is part of the Technology, Operations and Development Team.

Main Duties & Responsibilities:

  • Analyze schemas and applications to determine the most appropriate migration strategies and architectures.
  • Leverage the latest technologies and products to convert legacy database schemas and application code between database engines.
  • Identify and resolve any potential data related technical hurdles.
  • Operate development and production environments in the cloud by running and analyzing test results; performing diagnostics and troubleshooting; opening, prioritizing, and helping triage defects; and tracking and reporting test status and results.
  • Design solutions and tooling to execute automated database deployments and upgrades.
  • Standardize design patterns and usage of persistence solutions that can be adopted across multiple use cases.
  • Evaluate and drive changes to database and software architectures to address recurring issues or limitations.
  • Research and identify new opportunities for innovation.
  • Other duties may be assigned.

Minimum Requirements:

  • Bachelor’s degree in computer science, engineering, mathematics or related field required.
  • Five (5) years of related engineering experience including experience in development.
  • Minimum education and experience required can be substituted with the equivalent combination of education, training and experience that provides the required knowledge, skills and abilities.
  • Deep knowledge and experience designing and maintaining AWS databases and Data Lake technology.
  • Experience in engineering and managing multiple database flavors for complex production systems.
  • Working knowledge of Amazon Web Services (Aurora, Athena, DynamoDB, Glue, SageMaker, S3, etc.).
  • Broad awareness of customer workloads and use cases, including performance, availability, and scalability.
  • Experience analyzing issues holistically, from the application tier through the database, down to the storage.
  • Knowledge of relational database internals (locking, consistency, serialization, recovery paths), and Python scripting language.
  • Coding skills in the procedural language of at least one database engine.
  • Excellent communication skills.

Preferred Qualifications:

  • Systems engineering experience for troubleshooting and tuning.
  • Experience ingesting data from different sources and/or ETL development experience.
  • Experience developing software in one or more programming languages (Python, Java, etc.).

Tech Stack:

  • AWS Data Solution Architecture
    • AWS Lake Formation / S3
    • AWS Glue
    • AWS Kinesis Data Firehose
    • DynamoDB
    • Athena
    • NoSQL
    • Parquet
    • Aurora DB V2
  • Languages
    • Python, Java
