This channel is dedicated to all things Salesforce and Snowflake and is presented for the benefit of the community. I am also the author of the upcoming O'Reilly Salesforce Data Cloud book and the best-selling author of the O'Reilly book Snowflake: The Definitive Guide.
Hi Joyce, in Data Cloud, does the no-code/visual builder have the capability to create new fields, similar to SQL? Or is the no-code approach functionally just for filtering ready-to-use fields?
"Insightful!" :) Thank you Joyce. Something I've yet to still find out. In those glamourous slides that Salesforce use to show the unified profile, there is metrics often shown on that unifiied profile such as engagement score, lifetime value, etc. Does this mean that its possible to take the result/outcome of a calculated insight, and show it on the unified profile? Many thanks. PS: Looking forward to receive your book!
Can you please help me with these? It would be great if you could explain with some examples. 1. Why is a link-to-link relationship not recommended in the RDV (Raw Data Vault)? 2. In a BDV (Business Data Vault) bridge table, if we are storing only hash keys (not natural keys), then how are we going to get the natural keys into the fact/dimension?
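On question 2, the usual answer is that the natural (business) keys live on the hubs, so the bridge's hash keys are joined back to the hubs to recover them. A minimal sketch, assuming hypothetical hub and bridge names:

    -- The bridge stores only hash keys; each hub stores its hash key alongside
    -- the business (natural) key, so joining bridge -> hubs recovers the
    -- natural keys for the fact/dimension load.
    SELECT
        hc.customer_id  AS customer_natural_key,
        ho.order_number AS order_natural_key,
        b.snapshot_date
    FROM bridge_customer_order b
    JOIN hub_customer hc ON hc.hk_customer = b.hk_customer
    JOIN hub_order    ho ON ho.hk_order    = b.hk_order;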
Cool video. If you want to schedule the refreshes, you need Prep on the server (Prep Conductor). And if you have Prep Conductor, you might as well just publish a data source on the server. Thus, this approach seems to offer little advantage over a published data source.
The videos are very helpful for exam preparation. (Please note, there is a minor mistake in the graphic: in the latest micro-partitions example, where the partitions are on the same line, the number of overlapping micro-partitions is zero; however, the overlap depth is 1.)
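This distinction can be checked directly in Snowflake with the built-in SYSTEM$CLUSTERING_INFORMATION function; the table and column names below are hypothetical.

    -- Returns JSON including average_overlaps and average_depth. For a table
    -- whose micro-partitions do not overlap at all, average_overlaps is 0
    -- while average_depth is 1, because depth counts the partition itself.
    SELECT SYSTEM$CLUSTERING_INFORMATION('my_db.my_schema.loans', '(loan_date)');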
Hi Joyce, thanks for your clear explanation. I still have one question: how do I detect the deltas from my stages when there is NRT (near-real-time) data landing in the S3 bucket?
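One common pattern, sketched here with hypothetical names: let Snowpipe's auto-ingest feature react to S3 event notifications. Snowpipe keeps load history per file, so files already ingested are not reloaded, and only the newly landed deltas reach the table.

    -- Auto-ingest pipe: S3 event notifications trigger loads, and Snowpipe's
    -- file-level load history prevents re-ingesting files it has already seen.
    CREATE PIPE raw_db.public.loans_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO raw_db.public.loans
      FROM @raw_db.public.s3_loans_stage
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);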
Thank you so much for the information. My company is looking to purchase Salesforce and needs some kind of data cloud, so this video was really helpful in explaining what we may need. Again, thank you!
What is your background? Had you taken practice tests prior to your first/second attempts? Also, can somebody pass the exam if they read and understand your book?
Incredible video Joyce!!! I followed it step by step, and within 25 minutes I had a functional Azure Blob Storage and Snowpipe setup. Several UI features in the Azure Portal and Snowflake have changed since you made the video, but I was still able to navigate to the sections you referenced. Thank you again for this great resource.
Hi Joyce. Thank you so much for your efforts. Could you please guide a beginner like me on where to start learning? I would like to learn about Data Vault and see Snowflake implemented with Data Vault.
Hey Joyce, I'm currently studying to take my SnowPro Core exam in about two weeks and have started reading your book, Snowflake: The Definitive Guide. It seems your YouTube series has helped a lot of people pass the exam. Do you think these videos are still relevant enough to pass? I only ask because I see a lot of the comments are two years old, and I've heard the exam has changed since then. For example, you now need to score 75 points to pass instead of 80. Thanks!
Hi Joyce, thanks for the video. One thing that was a bit confusing, which I realized after scrubbing back through the video: when loannum 49000 was deleted, the row in Stream A was deleted, whereas Streams B and C had an extra row identifying that loannum 49000 was deleted.
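A likely explanation, hedged since it depends on each stream's offset (the stream and column names below are hypothetical): streams return the net change between offsets. If a stream had already captured the insert of loannum 49000 and had not yet been consumed, the later delete nets out against the insert and the row simply vanishes from the stream (the Stream A behavior); a stream whose offset began after the insert records the delete explicitly (the Stream B/C behavior).

    -- Inspect how the delete is represented in a given stream's change records.
    SELECT loannum, METADATA$ACTION, METADATA$ISUPDATE, METADATA$ROW_ID
    FROM loans_stream_b
    WHERE loannum = 49000;  -- expect a row with METADATA$ACTION = 'DELETE'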
Thanks Joyce for the interesting summary. So it seems there are two options: 1. Configure Salesforce Lakehouse to connect to Iceberg directly. 2. Configure Salesforce Lakehouse to connect to Snowflake, which can then access Iceberg via external tables. Does this mean we can have both 1 and 2 at the same time, for example?
Option 2 is the path, because Snowflake has developed platform infrastructure to fully support Iceberg. Option 1 is not really a viable option, at least not in the near term.
Hi @JoyceKayAvila, but can we have both connection types (direct raw Iceberg and Snowflake) configured in Salesforce? And I wonder how these configs are done. Are they global, or can they be per Salesforce domain, etc.?
@emanueol I'm working on gathering and summarizing some of the details that will help answer your specific questions. Looks like I'll need to prepare a follow-up video soon :)
DDL stands for Data Definition Language; DML stands for Data Manipulation Language. DDL statements are used to create databases, schemas, constraints, users, tables, etc. DML statements are used to insert, update, or delete records.
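A quick illustration with hypothetical object names:

    -- DDL defines the objects...
    CREATE DATABASE demo_db;
    CREATE SCHEMA demo_db.sales;
    CREATE TABLE demo_db.sales.orders (id INT, amount NUMBER(10,2));

    -- ...and DML manipulates the records inside them.
    INSERT INTO demo_db.sales.orders VALUES (1, 100.00);
    UPDATE demo_db.sales.orders SET amount = 150.00 WHERE id = 1;
    DELETE FROM demo_db.sales.orders WHERE id = 1;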
Thank you very much, Joyce. The video series, as well as the additional ones you have been adding, are really an asset for understanding the concepts. Appreciate the efforts. They definitely helped me in clearing the SnowPro certification.
Great video Joyce. Thank you! Just one minor point: Snowflake recommends using a storage integration object when reading data from Azure Blob Storage (along with an external stage), which avoids the need to supply SAS credentials when creating stages or loading data.
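A minimal sketch of that recommended setup, with a placeholder tenant ID, account, and container: the integration holds the Azure access details once, so the stage definition needs no SAS token.

    -- Storage integration: authenticates to Azure without SAS credentials.
    CREATE STORAGE INTEGRATION azure_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'AZURE'
      ENABLED = TRUE
      AZURE_TENANT_ID = '<your-tenant-id>'
      STORAGE_ALLOWED_LOCATIONS = ('azure://myaccount.blob.core.windows.net/mycontainer/');

    -- External stage referencing the integration; no SAS token needed here.
    CREATE STAGE my_azure_stage
      STORAGE_INTEGRATION = azure_int
      URL = 'azure://myaccount.blob.core.windows.net/mycontainer/'
      FILE_FORMAT = (TYPE = 'CSV');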