Sev1 Tech, Inc.

Senior Databricks Platform Engineer

ID
2025-9383
Type
Full Time w/ Benefits, Retirement Match
Location
US-VA-Arlington
Security Clearance
DHS Suitability

Overview / Job Responsibilities

We are seeking a highly skilled Senior Databricks Platform Engineer to design, implement, and maintain our enterprise-level Databricks platform supporting a federal customer organization's unified data initiative. This role's primary focus is platform management—centered on building scalable infrastructure, governance frameworks, and operational excellence—with secondary responsibilities in data engineering to support platform optimization and best practices.

You will serve as the steward of our Databricks platform, acting as the frontline technical owner responsible for creating robust, scalable processes and frameworks that enable multiple data teams to work efficiently and securely. This position offers a unique opportunity to work with cutting-edge cloud data technologies and shape the foundational infrastructure of our federal customer's data ecosystem.

Responsibilities:

  • Maintain an enterprise-scale Databricks platform, managing multiple workspaces.
  • Build and maintain platform observability through development of KPIs, executive dashboards, and monitoring solutions that provide visibility into system performance, resource consumption, cost trends, and compliance adherence.
  • Develop and implement scalable user provisioning and de-provisioning processes.
  • Design and manage RBAC (Role-Based Access Control) frameworks and security groups.
  • Integrate with enterprise identity management systems (SAML, SCIM, Active Directory).
  • Manage Unity Catalog implementation and governance structures.
  • Oversee metastore configuration, catalog hierarchies, and data organization.
  • Implement data lineage, auditing, and compliance frameworks.
  • Design and enforce compute policies, cluster configurations, and pool management strategies.
  • Implement cost optimization frameworks and resource allocation policies.
  • Monitor and optimize cluster utilization and performance.
  • Manage serverless SQL warehouses and compute resource governance.
  • Establish and manage connections to diverse source systems (databases, APIs, file systems).
  • Configure and maintain storage integrations (S3, ADLS, GCS, external locations), including data access patterns and mount points.
  • Implement and monitor secure credential management and secrets handling.
  • Build and configure Databricks workspaces, clusters, notebooks, jobs, and Delta Lake storage, integrating with AWS services such as S3, IAM, and KMS.
  • Implement and maintain security controls including IAM policies, cluster security configurations, KMS-based encryption, and data masking to protect sensitive data.
  • Monitor and tune Databricks clusters and job workloads for reliability, cost optimization, and performance, leveraging autoscaling and workload management.
  • Apply data governance best practices including cataloging, metadata management, and data lineage using Databricks Unity Catalog and AWS-native capabilities.
  • Collaborate with cloud infrastructure teams to configure supporting AWS components such as S3 storage, networking, logging, monitoring, and access controls.
  • Maintain detailed technical documentation including solution designs, data flow diagrams, configuration standards, and operational procedures.
  • Stay up to date with advancements in Databricks, Delta Lake, Spark, and AWS services, integrating new features that improve automation and efficiency.
  • Partner with data engineers, analysts, and scientists to implement data models and reusable transformation patterns within Databricks and AWS.
  • Troubleshoot and resolve platform-level issues including but not limited to workspace connectivity, cluster startup failures, authentication problems, and infrastructure performance degradation.
  • Ensure compliance with data governance, privacy, and security requirements by applying secure architecture patterns and validating controls throughout the data lifecycle.
  • Support the evaluation and hands-on testing of new AWS and Databricks features or services that enhance the Data Lake environment.

Minimum Qualifications

  • Bachelor's degree in computer science, information technology, or a related field. Equivalent experience will also be considered.
  • Proven experience in building and configuring enterprise-level data lake solutions using Databricks in an AWS or Azure environment.
  • In-depth knowledge of Databricks architecture, including workspaces, clusters, storage, notebook development, and automation capabilities.
  • Deep expertise in Databricks Unity Catalog, workspace management, and security models.
  • Experience with big data technologies such as Apache Spark, Apache Hive, Delta Lake, and Hadoop.
  • Solid understanding of data governance principles, data modeling, data cataloging, and metadata management.
  • Hands-on experience with cloud platforms like AWS or Azure, including relevant services like S3, EMR, Glue, Data Factory, etc.
  • Proficiency in SQL and one or more programming languages (Python, Scala, Bash, or PowerShell).
  • Knowledge of data security and privacy best practices, including data access controls, encryption, and data masking techniques.
  • Strong problem-solving and analytical skills, with the ability to identify and resolve complex data-related issues.
  • Excellent interpersonal and communication skills, with the ability to collaborate effectively with technical and non-technical stakeholders.
  • Experience in a senior role, providing technical guidance and mentorship to junior team members.
  • Relevant certifications such as Databricks Certified Developer or Databricks Certified Professional are highly desirable.
  • Eligibility/Clearance Requirements: Must be eligible to obtain a Department of Homeland Security EOD clearance (requires U.S. citizenship and a favorable background investigation).

Desired Qualifications

  • Experience in designing and implementing data ingestion pipelines, data transformations, and data quality processes using Databricks.
  • Experience working with federal government customers and compliance requirements (FedRAMP, FISMA).
  • Experience with CI/CD practices and version control for infrastructure (Git; GitHub/GitLab/Bitbucket; CI/CD pipelines).

Clearance Preference:

  • Active DHS/CISA suitability - 1st priority
  • Any DHS badge + DoD Top Secret - 2nd choice
  • DoD Top Secret + willingness to obtain DHS/CISA suitability - 3rd choice (it can take 10-60 days to obtain suitability – work can only begin once suitability is fully adjudicated).

About Sev1Tech LLC

Founded in 2010, Sev1Tech delivers IT, engineering, and program management solutions, focusing on program and IT support services for critical missions across federal and commercial clients. Our Mission is to Build better companies. Enable better government. Protect our nation. Build better humans across the country.

  • Join the Sev1Tech family, where you can achieve great accomplishments while enjoying a satisfying and rewarding career progression. Please apply directly through the website at: https://careers-sev1tech.icims.com/#joinSev1tech
  • For any additional questions or to submit any referrals, please contact: Caitlin.maupin@sev1tech.com
  • Sev1Tech is an Equal Opportunity and Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
