Role: Databricks Admin
Location: Pan India
Experience: 5+ yrs
Notice Period: 0-60 days
Roles & Responsibilities:
- Workspace Management: Create and manage Databricks workspaces, ensuring proper configuration and access control.
- User & Identity Management: Administer user roles, permissions, and authentication mechanisms.
- Cluster Administration: Configure, monitor, and optimize Databricks clusters for efficient resource utilization.
- Security & Compliance: Implement security best practices, including data encryption, access policies, and compliance adherence.
- Performance Optimization: Troubleshoot and resolve performance issues related to Databricks workloads.
- Integration & Automation: Work with cloud platforms (AWS, Azure, GCP) to integrate Databricks with other services.
- Monitoring & Logging: Set up monitoring tools and analyze logs to ensure system health.
- Data Governance: Manage Unity Catalog and other governance tools for structured data access.
- Collaboration: Work closely with data engineers, analysts, and scientists to support their workflows.
Qualifications:
- Proficiency in Python or Scala for scripting and automation.
- Knowledge of cloud platforms (AWS).
- Familiarity with Databricks Delta Lake and MLflow.
- Understanding of ETL processes and data warehousing concepts.
- Strong problem-solving and analytical skills.
Mandatory Skills: Databricks Admin, Python, Scripting
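For illustration, below is a minimal Python sketch of the kind of admin scripting the mandatory skills refer to: listing workspace clusters and their state via the Databricks REST API. The DATABRICKS_HOST and DATABRICKS_TOKEN environment variables and the /api/2.0/clusters/list endpoint are assumptions based on the public clusters API, not part of this posting.

```python
# Minimal sketch: list clusters and their state so an admin can spot
# clusters left RUNNING. Assumes DATABRICKS_HOST and DATABRICKS_TOKEN
# are set in the environment (hypothetical names for this example).
import os
import requests

host = os.environ["DATABRICKS_HOST"].rstrip("/")   # e.g. https://<workspace-url>
token = os.environ["DATABRICKS_TOKEN"]             # personal access token

resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    print(f"{cluster.get('cluster_name')}: {cluster.get('state')}")
```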