Databricks Platform Discussions
Dive into comprehensive discussions covering various aspects of the Databricks platform. Join the conversation to deepen your understanding and maximize your usage of the Databricks platform.

Browse the Community

Data Engineering

Join discussions on data engineering best practices, architectures, and optimization strategies with...

10499 Posts

Data Governance

Join discussions on data governance practices, compliance, and security within the Databricks Commun...

446 Posts

Generative AI

Explore discussions on generative artificial intelligence techniques and applications within the Dat...

168 Posts

Machine Learning

Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithm...

906 Posts

Warehousing & Analytics

Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Communi...

575 Posts

Databricks Free Trial Help

Engage in discussions about the Databricks Free Trial within the Databricks Community. Share insight...

59 Posts

Activity in Databricks Platform Discussions

royinblr11
by > New Contributor
  • 91 Views
  • 2 replies
  • 0 kudos

LLM with the largest context window

A Generative AI Engineer is tasked with developing an application that is based on an open-source large language model (LLM). They need a foundation LLM with a large context window. Which model fits this need? DBRX, Llama2-70B, DistilBERT, or MPT-30B? DBRX h...

Latest Reply
LRALVA
Valued Contributor II
  • 0 kudos

@royinblr11 You're absolutely right to question the answer: the correct model for an application needing a foundation LLM with a large context window is DBRX. Why DBRX is the best fit: it is a foundation model, designed for generation tasks. It support...

1 More Replies
mridultuteja
by > New Contributor II
  • 122 Views
  • 5 replies
  • 0 kudos

external table not being written to data lake

I was following a tutorial to learn Databricks from https://youtu.be/7pee6_Sq3VY (great video, btw). I am stuck at 2:52:24. I am trying to create an external table directly in the data lake, but I am facing a weird issue saying no such location exists. I h...
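For readers hitting the same error, the pattern being attempted looks roughly like this (the storage account, container, and table names here are hypothetical placeholders):

```sql
-- Hypothetical names: substitute your own storage account, container, and table.
-- The path must already be reachable by the cluster (e.g. via a Unity Catalog
-- external location or configured storage credentials), otherwise a
-- "no such location" style error appears.
CREATE TABLE demo.sales_external
USING DELTA
LOCATION 'abfss://data@mystorageacct.dfs.core.windows.net/sales';
```

In practice this error usually means the URI is misspelled or the cluster cannot authenticate to that path, rather than a syntax problem.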

Latest Reply
mridultuteja
New Contributor II
  • 0 kudos

I still need help with this.

4 More Replies
msserpa
by > New Contributor II
  • 4443 Views
  • 3 replies
  • 0 kudos

Can we turn off Playground & Marketplace for some users?

Hi everyone, hope all is well with you! I'm reaching out to the community for some advice on customizing our workspace settings. Specifically, I have two things I'm trying to figure out. Disabling the "Playground" option: is there a way to turn off the ...

Administration & Architecture
Databricks
Marketplace
Playground
Latest Reply
Wing139
New Contributor II
  • 0 kudos

How can we turn on the Playground option? I cannot find it.

2 More Replies
MaartenH
by > Visitor
  • 111 Views
  • 4 replies
  • 1 kudos

Lakehouse federation for SQL server: database name with spaces

We're currently using Lakehouse Federation for various sources (Snowflake, SQL Server), usually successfully. However, we've encountered a case where one of the databases on the SQL Server has spaces in its name, e.g. 'My Database Name'. We've tried vari...

Latest Reply
Nivethan
New Contributor III
  • 1 kudos

Hi @MaartenH, catalog and schema names containing spaces are not allowed at creation time, even when quoted with backticks (``). The current best option is to rename the schema using only allowed characters; you can find the best practices here ...
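For context, a federated SQL Server catalog is declared along these lines (the connection and catalog names are hypothetical); the space-containing database name would have to go in the `database` option, which is where the restriction bites:

```sql
-- Hypothetical connection/catalog names; per the reply above, renaming the
-- source database to avoid spaces is the current workaround.
CREATE FOREIGN CATALOG sqlserver_cat
USING CONNECTION sqlserver_conn
OPTIONS (database 'My Database Name');
```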

3 More Replies
SeekingSolution
by > New Contributor II
  • 76 Views
  • 1 replies
  • 0 kudos

Unity Catalog Enablement

Hello, after scouring documentation yesterday, I was finally able to get Unity Catalog enabled and assigned to my workspace. Or so I thought. When I run the current_metastore() command I get the below error. However, when I look at my catalog I can see...

Latest Reply
Nivethan
New Contributor III
  • 0 kudos

Hi, please check whether the cluster you are using to run the query has also been upgraded to Unity Catalog. Also, follow the best practices outlined here for enablement: https://docs.databricks.com/aws/en/data-governance/unity-catalog/enable-workspaces Best Rega...
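Once the compute itself is on a Unity Catalog-enabled access mode, the attachment can be verified with the built-in function the poster mentioned:

```sql
-- Returns the metastore attached to the workspace; errors if the compute
-- running the query is not Unity Catalog-enabled.
SELECT current_metastore();
```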

SQLBob
by > New Contributor
  • 98 Views
  • 1 replies
  • 0 kudos

Unity Catalog Python UDF to Send Messages to MS Teams

Good Morning All - This didn't seem like such a daunting task until I tried it. Of course, it's my very first function in Unity Catalog. Attached are images of both the UDF and example usage I created to send messages via the Python requests library ...

Latest Reply
SQLBob
New Contributor
  • 0 kudos

This has been dropped in favor of using a function defined internally within a notebook. If anyone has occasion to set up a similar process, please let me know. Thanks
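For anyone picking this up, a minimal sketch of the notebook-level approach (the webhook URL is a hypothetical placeholder; Teams incoming webhooks accept a simple JSON body):

```python
import json

# Sketch of a notebook-level helper, assuming a Teams incoming webhook.
def build_teams_payload(text: str) -> str:
    """Serialize a minimal message payload for an incoming webhook."""
    return json.dumps({"text": text})

def send_teams_message(webhook_url: str, text: str) -> int:
    """POST the payload; returns the HTTP status code (200 on success)."""
    import requests  # preinstalled on Databricks runtimes
    resp = requests.post(
        webhook_url,
        data=build_teams_payload(text),
        headers={"Content-Type": "application/json"},
        timeout=10,
    )
    return resp.status_code
```

Calling `send_teams_message("https://example.webhook.office.com/...", "job done")` from a notebook cell would then post the message, assuming the webhook exists and the cluster has outbound network access.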

drag7ter
by > Contributor
  • 106 Views
  • 3 replies
  • 0 kudos

how to restrict creation serving endpoints in databricks to a user

Is it possible somehow to restrict creation of serving endpoints to specific users? I want to grant Workspace access under the Entitlements of a specific group, but I do not want to allow users of this group to create serving endpoints. The only way I...

Latest Reply
LRALVA
Valued Contributor II
  • 0 kudos

@drag7ter While I mentioned Workspace-level entitlements as an approach, I need to clarify something important: Databricks doesn't currently support creating fully customized entitlements where you can exclude specific permissions like "serving endpo...

2 More Replies
jar
by > New Contributor III
  • 119 Views
  • 3 replies
  • 0 kudos

Define time interval for when a cluster can be active

Hullo good Databricks people. I have a small dedicated cluster being used for Direct Query (PBI) which has a long termination period. I'd like for it to only be active during business hours, though, and to set a restriction so that it's not possible to...

Latest Reply
LRALVA
Valued Contributor II
  • 0 kudos

Hi @jar Your scenario is quite common: managing costs while ensuring availability during business hours for Power BI Direct Query. Let me share some options based on my experience. Scheduled cluster policies: the most straightforward approach would be ...
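One hedged sketch of the scheduled-job variant of this idea: a small job that runs every 15 minutes and terminates the cluster outside business hours via the Clusters REST API (`host`, `token`, and `cluster_id` are hypothetical placeholders, and the 08:00-18:00 window is an assumption):

```python
from datetime import time

# Assumed business-hours window; adjust to your timezone and schedule.
BUSINESS_START, BUSINESS_END = time(8, 0), time(18, 0)

def outside_business_hours(now: time) -> bool:
    """True when the current local time falls outside 08:00-18:00."""
    return not (BUSINESS_START <= now < BUSINESS_END)

def terminate_cluster(host: str, token: str, cluster_id: str) -> int:
    """Terminate (not delete) a cluster via the Clusters API."""
    import requests  # preinstalled on Databricks runtimes
    resp = requests.post(
        f"{host}/api/2.0/clusters/delete",  # this endpoint terminates the cluster
        headers={"Authorization": f"Bearer {token}"},
        json={"cluster_id": cluster_id},
        timeout=30,
    )
    return resp.status_code
```

This only enforces shutdown; preventing restarts outside the window would still need a cluster policy or permissions change.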

2 More Replies
DaPo
by > New Contributor II
  • 68 Views
  • 1 replies
  • 0 kudos

Model Serving Endpoint: Cuda-OOM for Custom Model

Hello all, I am tasked with evaluating a new LLM for some use cases. In particular, I need to build a POC for a chat bot based on that model. To that end, I want to create a custom serving endpoint for an LLM pulled from Hugging Face. The model itself is...

Latest Reply
sarahbhord
Databricks Employee
  • 0 kudos

Here are some suggestions:
1. Update conda.yaml. Replace the current config with this optimized version:
channels:
- conda-forge
dependencies:
- python=3.10 # 3.12 may cause compatibility issues
- pip
- pip:
  - mlflow==2.21.3
  - torch...

greenPlatypus
by > New Contributor
  • 194 Views
  • 1 replies
  • 0 kudos

How to Get Access to All-Purpose Compute Cluster While on Free Trial

Hi everyone,I signed up for the Databricks Community Edition free trial with the intention of testing a 3rd-party integration with Databricks. When trying to set up the integration, the account only showed the SQL Warehouses compute option, and not t...

Latest Reply
sarahbhord
Databricks Employee
  • 0 kudos

Hey greenPlatypus -  Free trials only offer serverless/SQL compute clusters (due to resource and cost controls). Standard or Premium tiers provide all-purpose compute clusters, and the Premium tier is recommended for full features and future proofing...

vaibhavaher2025
by > New Contributor
  • 104 Views
  • 1 replies
  • 0 kudos

How to get response from API call made via executor

Hi guys, I'm trying to call multiple APIs via executors using foreachPartition. However, as the API response is returned at the executor level, I'm unable to see whether the response is 200 or 500. I don't want my APIs to execute on the driver, so I'm ...

Latest Reply
sarahbhord
Databricks Employee
  • 0 kudos

Vaibhavaher2025 - I recommend trying the following: 1. Write logs from executors to persistent storage inside process_partition. 2. Use mapPartitions instead of foreachPartition to return responses back to the driver as a DataFrame. 3. Check executor log...
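A minimal sketch of suggestion 2, with `call_api` as a hypothetical stand-in for the real HTTP client:

```python
# Sketch: return API statuses to the driver instead of losing them on executors.
def call_api(record):
    # Placeholder; in the real job this would be e.g. requests.post(...).status_code.
    return 200

def process_partition(rows):
    # Runs once per partition on an executor; yields (record, status) pairs
    # so the driver can collect outcomes instead of only seeing executor logs.
    for row in rows:
        try:
            status = call_api(row)
        except Exception as exc:  # surface failures as data rather than swallowing them
            status = f"error: {exc}"
        yield (row, status)

# On Databricks you would then use:
#   result_df = df.rdd.mapPartitions(process_partition).toDF(["record", "status"])
```

Unlike foreachPartition, mapPartitions produces a new RDD, so the per-record status codes come back to the driver as ordinary rows.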

VVM
by > New Contributor III
  • 19330 Views
  • 14 replies
  • 3 kudos

Resolved! Databricks SQL - Unable to Escape Dollar Sign ($) in Column Name

It seems that due to how Databricks processes SQL cells, it's impossible to escape the $ when it comes to a column name. I would expect the following to work: %sql SELECT 'hi' `$id`. The backticks ought to escape everything. And indeed that's exactly wha...
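One workaround that has been reported for notebook SQL cells (worth verifying on your runtime) is disabling Spark SQL variable substitution so `$` loses its special meaning:

```sql
-- With substitution off, $id is treated as literal text in the identifier.
SET spark.sql.variable.substitute = false;
SELECT 'hi' AS `$id`;
```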

Latest Reply
rgower
New Contributor II
  • 3 kudos

+1 here - hoping to hear any updates.

13 More Replies
anmol-aidora
by > New Contributor
  • 200 Views
  • 6 replies
  • 0 kudos

Resolved! Serverless: ERROR: Could not install packages due to an OSError: [Errno 13] Permission denied

Hello guys! I am getting this error when running a job: ERROR: Could not install packages due to an OSError: [Errno 13] Permission denied: '/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/some-python-package' I have lis...

Latest Reply
anmol-aidora
New Contributor
  • 0 kudos

Thanks for clarifying Isi, really appreciate it

5 More Replies
aravind-ey
by > New Contributor
  • 916 Views
  • 5 replies
  • 1 kudos

vocareum lab access

Hi, I am doing a data engineering course in Databricks (Partner Labs) and would like access to the Vocareum workspace to practice using the demo sessions. Can you please help me get access to this workspace? Regards, Aravind

Latest Reply
twnlBO
New Contributor II
  • 1 kudos

Can you please provide links? Screenshots? More info? This answer is not specific enough. I'm taking the Data Analysis learning path; there are different demos I'd like to practice, and there are no SP Lab environment links as mentioned in the videos.

4 More Replies
soumiknow
by > Contributor II
  • 3082 Views
  • 22 replies
  • 1 kudos

Resolved! BQ partition data deleted fully even though 'spark.sql.sources.partitionOverwriteMode' is DYNAMIC

We have a date-partitioned (DD/MM/YYYY) BQ table. We want to overwrite a specific partition's data using PySpark. To do this, I set 'spark.sql.sources.partitionOverwriteMode' to 'DYNAMIC' as per the Spark BQ connector documentat...

Latest Reply
VZLA
Databricks Employee
  • 1 kudos

@soumiknow , Just checking if there are any further questions, and did my last comment help?

21 More Replies