- 193 Views
- 5 replies
- 0 kudos
Spark connect client and server versions should be same for executing UDFs
I am trying to execute a pandas UDF in Databricks. It gives me the following error on serverless compute: File /local_disk0/.ephemeral_nfs/envs/pythonEnv-b11ff17c-9b25-4ccb-927d-06a7d1ca7221/lib/python3.11/site-packages/pyspark/sql/connect/client/core.p...
Serverless is management-free, which means you cannot choose the image. Hope this helps. Lou.
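Since the serverless image cannot be changed, the practical fix is to pin the local `pyspark` client to the same major.minor version as the server runtime. A minimal sketch of that check — `versions_compatible` is a hypothetical helper, not a Databricks or PySpark API:

```python
# Hedged sketch: Spark Connect generally requires the client (local
# pyspark) and server runtime to agree on major.minor for UDFs to work.

def versions_compatible(client_version: str, server_version: str) -> bool:
    """Return True when client and server agree on major.minor."""
    client = tuple(int(p) for p in client_version.split(".")[:2])
    server = tuple(int(p) for p in server_version.split(".")[:2])
    return client == server

# A 3.5.x client against a 3.5.x server is fine; 3.4.x against 3.5.x
# is the kind of mismatch that surfaces as errors inside connect/client/core.
print(versions_compatible("3.5.0", "3.5.2"))   # True
print(versions_compatible("3.4.1", "3.5.0"))   # False
```

In a real environment you would compare `pyspark.__version__` on the client against the server's Spark version, and `pip install` a matching client if they differ.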
- 775 Views
- 1 replies
- 0 kudos
Linking Workspace IDs to Names in Billing Schema
Hi everyone, We recently enabled UC and the Billing system table to monitor our usage and costs. We've successfully set up a dashboard to track these metrics for each workspace. The usage table includes the workspace_id, but I'm having trouble finding...
I got this from their older version of the dashboard: dbdemos uc-04-system-tables. When everything is executed, go to your graph in the dashboard, click the three dots in the top right, select (in my case) "View dataset: usage_overview", then paste/modify the SQL ...
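The shape of that join can be sketched as below. The workspace-metadata table name `system.access.workspaces_latest` is an assumption — verify which system table your account exposes for workspace names before relying on it:

```python
# Hedged sketch: join billing usage to workspace names so the dashboard
# can show names instead of raw workspace IDs.
usage_by_workspace_sql = """
SELECT
  w.workspace_name,
  u.workspace_id,
  SUM(u.usage_quantity) AS total_dbus
FROM system.billing.usage AS u
LEFT JOIN system.access.workspaces_latest AS w  -- assumed table name
  ON u.workspace_id = w.workspace_id
GROUP BY w.workspace_name, u.workspace_id
"""

# In a notebook you would run: display(spark.sql(usage_by_workspace_sql))
print(usage_by_workspace_sql)
```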
- 8883 Views
- 8 replies
- 1 kudos
How to move a metastore to a new Storage Account in unity catalog?
Hello, I would like to change the Metastore location in Databricks Account Console. I have one metastore created that is in an undesired container/storage account. I could see that it's not possible to edit a metastore that is already created. I coul...
We ended up 1) deleting the metastore (which only contained catalogs/schemas/tables), 2) creating a new one in the desired storage account, and 3) re-populating it by running all Delta Live Tables pipelines. All our underlying raw data is stored in ano...
- 27 Views
- 0 replies
- 0 kudos
Databricks Unity Catalog Metastore
Hey everyone, I deleted my Unity Catalog metastore and now want to point it to another Azure storage account (ADLS). However, once a metastore is created, its storage location cannot be changed. Therefore, I deleted the existing metastore and created ...
- 687 Views
- 1 replies
- 0 kudos
Databricks is not mounting with storage account, giving java.lang exception error 480
Hi Everyone, I am currently facing an issue in our Test environment where Databricks is not able to mount with the storage account. We are using the same mount in the other environments (Dev, Preprod, and Prod) and it works fine there witho...
- 157 Views
- 2 replies
- 0 kudos
JDBC Driver cannot connect when using TokenCachePassPhrase property
Hello all, I'm looking for suggestions on enabling the token cache when using browser-based SSO login. I'm following the instructions found here: Databricks-JDBC-Driver-Install-and-Configuration-Guide. For my users, I would like to enable the token ca...
The error encountered (Cannot invoke "java.nio.file.attribute.AclFileAttributeView.setAcl(...)" because "<local6>" is null) likely points to permission or file-system issues where the token cache store is being accessed. When EnableTokenCache=0, the to...
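For reference, a connection URL with the token cache enabled can be sketched as follows. Property names follow the Databricks JDBC driver configuration guide (AuthMech=11 for OAuth 2.0, Auth_Flow=2 for browser-based SSO); the host, HTTP path, and passphrase are placeholders:

```python
# Hedged sketch: assemble a Databricks JDBC URL that enables the token
# cache protected by a passphrase. Values in angle brackets are
# placeholders to fill in from your workspace's connection details.
base = "jdbc:databricks://<server-hostname>:443"
props = {
    "httpPath": "<http-path>",
    "AuthMech": "11",                  # OAuth 2.0
    "Auth_Flow": "2",                  # browser-based SSO
    "EnableTokenCache": "1",           # keep tokens between sessions
    "TokenCachePassPhrase": "<your-passphrase>",
}
jdbc_url = base + ";" + ";".join(f"{k}={v}" for k, v in props.items())
print(jdbc_url)
```

If the ACL error persists, check that the OS user running the driver can create and set permissions on files in the directory where the driver keeps its token cache.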
- 657 Views
- 1 replies
- 0 kudos
How to create a mount point to File share in Azure Storage account
Hello All, I have a requirement to create a mount point to a file share in an Azure Storage account. I followed the official documentation; however, I could not create the mount point to the file share, and the documentation described the mount point creatio...
Hi Raja, You're correct that the wasbs:// method is for Azure Blob Storage, not File Shares! I believe File Share mounting is different and would require you to use the SMB protocol mounted outside of Databricks, since File Shares aren't natively supported!...
- 4145 Views
- 3 replies
- 1 kudos
Databricks Notebook says "Connecting.." for some users
For some users, after clicking on a notebook the screen says "connecting..." and the notebook does not open.The users are using Chrome browser and the same happens with Edge as well.What could be the reason?
I am facing the same issue. It keeps saying it is opening the notebook. On the rare occasion it does open and connects to the cluster, it then times out.
- 202 Views
- 0 replies
- 0 kudos
Integrate Genie to teams
Hey, I'm trying to integrate Genie into Teams. I am an admin and have all rights, and I created a Genie to test. We are encountering a PermissionDenied error while interacting with the Genie API via the SDK and a workspace token. Details: Workspace URL: https://dbc-125a3...
- 1121 Views
- 7 replies
- 0 kudos
preloaded_docker_images: how do they work?
At my org, when we start a Databricks cluster, it often takes a while to become available (due to (1) instance provisioning, (2) library loading, and (3) init script execution). I'm exploring whether an instance pool could be a viable strategy for im...
Hello, when we specify a Docker image with credentials in the instance pool configuration, should we also specify credentials in the cluster configuration, given that the image is already pulled onto the pool instances?
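The two payloads in question can be sketched as below, following the shape of the Instance Pools and Clusters APIs (`preloaded_docker_images` on the pool, `docker_image` on the cluster). Whether `basic_auth` must be repeated on the cluster is exactly the open question; repeating it in both places is the conservative choice to test:

```python
# Hedged sketch of the pool and cluster payloads. Names, node type, and
# registry URL are placeholders, not values from the thread.
pool_config = {
    "instance_pool_name": "docker-pool",        # placeholder name
    "node_type_id": "Standard_DS3_v2",          # placeholder node type
    "preloaded_docker_images": [
        {
            "url": "myregistry.example.com/myimage:latest",
            "basic_auth": {"username": "<user>", "password": "<token>"},
        }
    ],
}

cluster_config = {
    "instance_pool_id": "<pool-id>",            # returned when the pool is created
    "docker_image": {
        "url": "myregistry.example.com/myimage:latest",
        # Repeating basic_auth here is an assumption to verify; the image
        # may already be cached on the pool's instances.
        "basic_auth": {"username": "<user>", "password": "<token>"},
    },
}

# The image URL on the cluster must match one preloaded on the pool,
# otherwise the preload buys nothing.
assert (cluster_config["docker_image"]["url"]
        == pool_config["preloaded_docker_images"][0]["url"])
```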
- 2865 Views
- 2 replies
- 0 kudos
Is there an automated way to strip notebook outputs prior to pushing to github?
We have a team that works in Azure Databricks on notebooks. We are not allowed to push any data to GitHub per corporate policy. Instead of everyone having to always remember to clear their notebook outputs prior to commit and push, is there a way this ...
Hi, if pushing outputs to GitHub isn't allowed but clearing notebook outputs before version control is still important, you can automate this process by using a pre-commit hook or a script within your internal CI/CD pipeline (if one exists). Tools like...
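One common way to wire this up is the `nbstripout` hook for the pre-commit framework. A sketch of the repo-level config, assuming pre-commit is installed — pin `rev` to a current release tag rather than trusting the one shown here:

```yaml
# .pre-commit-config.yaml -- strips notebook outputs on every commit
repos:
  - repo: https://github.com/kynan/nbstripout
    rev: 0.7.1        # assumption: check for the latest release tag
    hooks:
      - id: nbstripout
```

After adding the file, each developer runs `pre-commit install` once in their clone; from then on, outputs are removed automatically before any commit reaches the remote.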
- 46347 Views
- 11 replies
- 1 kudos
Error: Folder [email protected] is protected
Hello, On Azure Databricks I'm trying to remove a folder under the Repos folder using the following command: databricks workspace delete "/Repos/[email protected]". I got the following error message: Error: Folder ...
Hello Databricks Forums, When you see the Azure Databricks error message "Folder [email protected] is protected," it means that you are attempting to remove a system-protected folder, which is usually connected to a user's workspace, particularly under the...
- 543 Views
- 3 replies
- 2 kudos
Resolved! Cluster by auto pyspark
I can find documentation to enable automatic liquid clustering with SQL: CLUSTER BY AUTO. But how do I do this with PySpark? I know I can do it with spark.sql("ALTER TABLE CLUSTER BY AUTO"), but ideally I want to pass it as a .option(). Thanks in...
Not at the moment. You have to use the SQL DDL commands, either at table creation or via an ALTER TABLE command. Hope this helps, Louis.
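The DDL route the reply describes, issued from PySpark, can be sketched as follows. The catalog/schema/table name and columns are placeholders; there is no DataFrame writer `.option()` equivalent as of this reply:

```python
# Hedged sketch: enable automatic liquid clustering via SQL DDL strings
# executed through spark.sql(), since no writer option exists.
create_ddl = """
CREATE TABLE IF NOT EXISTS main.default.events (
  id BIGINT,
  ts TIMESTAMP
)
CLUSTER BY AUTO
"""

alter_ddl = "ALTER TABLE main.default.events CLUSTER BY AUTO"

# In a notebook: spark.sql(create_ddl) at creation time, or
# spark.sql(alter_ddl) for an existing table.
print(create_ddl)
print(alter_ddl)
```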
- 190 Views
- 2 replies
- 1 kudos
Enroll, Learn, Earn Databricks !!
Hello Team, I attended the session at CTS Manyata on 22nd April. I am interested in pursuing the certifications, but while enrolling it says I am not a member of any group. Link for the available certifications and courses: https://community...
Hi @samgupta88, you can find it in the Partner Academy. Everything is listed in the partner portal.
- 242 Views
- 4 replies
- 0 kudos
UCX Installation error
Error Message: databricks.sdk.errors.platform.ResourceDoesNotExist: Can't find a cluster policy with id: 00127F76E005AE12.
Click into each policy in the Compute UI of the Workspace to see if the policy ID exists. If it does, then the account that invoked the SDK method didn't have workspace admin permissions.