Position Description:
- Utilize AWS database and data processing services, including Amazon RDS, Aurora, DynamoDB, Redshift, ElastiCache, EMR, and Neptune.
- Configure, launch, and manage EMR clusters, optimizing for performance and cost and integrating with various data sources.
- Architect scalable, high-performance, and secure database solutions.
- Integrate databases with other AWS services such as Lambda, S3, EC2, and Glue.
- Apply AWS security best practices (IAM, encryption, VPC, security groups) and compliance requirements for data storage.
- Design backup, disaster recovery, high availability, and failover solutions (Multi-AZ, read replicas).
- Troubleshoot AWS-related issues and incidents.
- Design, implement, and manage virtual data warehouses in Snowflake.
- Write complex SQL queries for data analysis, transformation, and reporting.
- Optimize queries, manage caching and clustering, and leverage auto-scaling to handle workloads.
- Ensure compliance with SSA security, privacy, and data management policies.
- Brief management, customers, team members, or vendors, in writing or orally, at a technical level appropriate to the audience.
- Provide guidance and support to junior- and mid-level developers.
- Perform all other duties as assigned or directed.
Skills Requirements:

Foundation for Success (Basic Qualifications):
- Bachelor's Degree in Computer Science, Mathematics, Engineering, or a related field; a Master's or Doctorate degree may substitute for required experience.
- 7+ years of experience as a Database Architect.
- Experience with AWS database and data processing services, including Amazon RDS, Aurora, DynamoDB, Redshift, ElastiCache, EMR, and Neptune.
- Knowledgeable in AWS security best practices (IAM, encryption, VPC, security groups) and compliance requirements for data storage.
- Strong troubleshooting skills for AWS-related issues and incidents.
- Skilled in optimizing queries, managing caching and clustering, and leveraging auto-scaling to handle workloads.
- Proficient in automation and scripting (CloudFormation, Terraform, AWS CLI, Python, Bash, PowerShell) for database and workflow management.

Factors To Help You Shine (Required Skills):

**Selected candidate must be able to obtain and maintain a public trust clearance.**
**Selected candidate must be willing to work on-site in Woodlawn, MD 5 days a week.**
- Skilled in both relational and NoSQL database management, data modeling, normalization, and schema design.
- Experienced in configuring, launching, and managing EMR clusters, optimizing for performance and cost and integrating with various data sources.
- Skilled in designing backup, disaster recovery, high availability, and failover solutions (Multi-AZ, read replicas).
- Advanced skills in writing complex SQL queries for data analysis, transformation, and reporting.
AWS Experience:
- Proven ability to architect scalable, high-performance, and secure database solutions, with proficiency in query optimization, indexing, caching, and performance tuning using AWS tools.
- Experienced in integrating databases with other AWS services such as Lambda, S3, EC2, and Glue.
- Adept at securing AWS clusters, managing IAM roles, implementing encryption, and monitoring jobs using AWS tools (CloudWatch, CloudTrail).
Snowflake Experience:
- Proficient in designing, implementing, and managing virtual data warehouses in Snowflake, with a strong understanding of micro-partitions.
- Experienced in schema design, normalization, and building efficient data structures in Snowflake.
- Familiar with data ingestion, transformation, and loading using Snowflake's features and integration tools.
- Knowledgeable in Snowflake's role-based access control, data encryption, and security best practices.
- Experienced in integrating Snowflake with BI tools (Tableau, Power BI) and using Python for data pipelines and machine learning applications.

How To Stand Out From The Crowd (Desired Skills):
- Experience identifying requirements, researching options, and designing and implementing solutions at a leadership level on project work.
- Experience scripting in a Linux environment to automate ETL solutions and data migration jobs.
- Understanding of and ability to apply quality techniques and practices (automated unit testing, Test-Driven Development, performance analysis, continuous integration).
- Ability to contribute architectural and tactical thinking, information solutions, and road maps that drive architectural recommendations.
- Detail-oriented and committed to consistently completing repetitive production-like tasks on time and error-free.
- Prior experience working in a federal agency or public sector environment.
- Effective communication skills for working with cross-functional teams, including data scientists, analysts, and business stakeholders.
- Ability to translate technical concepts into non-technical terms for business stakeholders.
- Ability to manage multiple projects, prioritize tasks, and deliver within deadlines.