WebSenor is hiring for a remote position
Position: Databricks Developer (Core / Data Platform)
Experience: 7+ years
Location: Remote (US hours)
Open Positions: 2
Job Role:
We are seeking a core Databricks Developer with strong experience in DataOps and Lakehouse architecture to support an enterprise data platform. The role focuses on building, optimizing, and supporting scalable data pipelines using Databricks in a production environment.
Key Responsibilities
• Develop and maintain data pipelines using PySpark and Spark SQL
• Build and manage Delta Lake tables (Bronze, Silver, Gold layers)
• Support Lakehouse architecture and enterprise data platform standards
• Apply DataOps best practices for deployment, monitoring, and reliability
• Work with Unity Catalog for data governance, access control, and lineage
• Optimize Spark jobs for performance and cost efficiency
• Collaborate with onshore engineering, analytics, and platform teams
Required Skills
• 7+ years of data engineering experience
• 5+ years of hands-on Databricks experience
• Strong skills in PySpark, Spark SQL, and Delta Lake
• Working knowledge of DataOps, Unity Catalog, and Lakehouse architecture
• Experience supporting enterprise data platforms in production
• Strong communication and documentation skills
• Strong hands-on Azure and ETL experience (mandatory)
Optional / Nice-to-Have Skills
• Experience with Azure Data Factory (ADF)
• Experience with Microsoft SQL Server (MS SQL)
• Experience integrating Databricks with Azure-native services
• Familiarity with DevOps tools (Azure DevOps, GitHub)
• Experience working with global or distributed teams
Salary: As per industry standards. No bar for the right candidate.