We are a channel dedicated to "simplifying marketing"
To the uninitiated, marketing is "just running ads" — they don't realise there is an underlying science to it. Sales and marketing have evolved over the centuries, and now draw on neuroscience, technology, psychology, and even more disciplines, which can seem very complex.
Stay with us and we'll bring you the most important concepts in simple, easy-to-understand videos.
When I try to configure the destination as per the video, I get the error message below when I test the connection. Can anyone help with this error? When I choose the option to use insert statements, the connection establishes successfully; however, I get an "I don't have permission to create table" error, so I'm not sure what to do. Any help will be much appreciated.

Configuration check failed State code: NoSuchKey; Message: The specified key does not exist. (Service: Amazon S3; Status Code: 404; Error Code: NoSuchKey; Request ID: null; S3 Extended Request ID: null; Proxy: null)
Friend, I like the concept, but so many details you shared are... not really correct. You distinguish the functionality of TF and PyTorch when their use cases heavily overlap. You place Keras as a separate framework, when in fact it uses other NN libraries as a backend (and is even fully integrated into TF). Scikit-learn... well, you aren't wrong, but it really is horrifically bad at anything deep learning. It's sort of a shame, because it has lovely tools that are totally useless on most modern problems. I'm not certain I'd even recommend it, but that really is my subjective opinion. Anyway, nice enough video, I really like the concept, but you've gotta fact-check your stuff better if you're going to do this. Looking forward to seeing more.
Can you provide the MySQL roadmap as a website link or notes? It would be very useful for us, as we can't watch the whole video and pause it again and again! I hope you understand, and thank you!
Hi, I have a few questions for you. Are you a marketer or a data scientist? I am an IT graduate working as a paid analyst (DM) and currently pursuing data science. Do you have any suggestions for me? I would appreciate any advice you can offer. 🙏
I've been both a marketer and a data scientist in a few domains. When it comes to advice, here's my 2 cents:
1. As a paid analyst you're in the sweet spot of many rapid changes and innovations in marketing science. You might already know that attribution is getting more challenging than ever, and building skills in causal inference/ML, incrementality, media mix modeling, attribution modeling, and time series forecasting will give you a solid foundation in marketing science. Causal ML can also be applied in many other domains if you ever want to move away from marketing science.
2. I can't stress enough the importance of spending time on job portals just researching what skills employers in your space are looking for, and matching that against the pay they're offering. On LinkedIn you can select different countries from the dropdown to widen your search, and most listings will also display a salary range. Make a note of the most sought-after skills and start learning them.
3. Build end-to-end projects that get progressively more challenging in one or two of the topics mentioned above, and from your own research. Once you've done a few projects, find the one or two topics that employers are paying more for and go deep into them by reading research articles, watching conference talks online, etc.
Hope that helps
DSA will be beneficial, and building a good foundation in it will help with problem solving. I've found the @NeetCode YouTube channel has some good resources on this (I'm not affiliated with that channel in any way at all, just found it easy to understand and follow).
I'm going through the IBM Data Science course on Coursera. Should I continue with it, or quit and learn from another source? Also, please suggest other sources, like on YouTube.
Doing a course (even the highest quality course) is only half of the equation; the other half is how deeply/widely you understand the concepts, and how effectively you can communicate those concepts and apply them to solving problems.
1. Go through job portals and study the requirements for a senior position (not entry level), and learn as much as you can about the technologies, tools, and skills mentioned. Start using ChatGPT and YouTube to understand and learn more.
2. For the projects you're doing in the course, ask how much further you can take each project and what you can add to it. Actively look for problems that were left unsolved in the project and build an opinion on how you'd solve them in the real world at a real company.
3. Connect with people on LinkedIn and ask for remote mentoring or an internship. When you reach out, be clear about what you want from them and what problems you're trying to solve.
Hope that helps
Any reason why you don't just use the UA–Data Studio connection and start building up what you need in Data/Looker Studio, exporting from there? Or let me ask in a different way: why would you need to build up the data in Google Sheets first instead of just using the connector between Looker Studio and UA?
Great video! Thank you very much! I like how you intentionally ran into the PERMISSION-DENIED error, and showed us how to solve it. It's probably one of the most common places people get stuck.
Great video! I wonder if you have run into the issue of the Google Analytics API limiting the data to a 14 month window? When I run a Google Analytics Universal export, with a start date of e.g. 2020-01-01, the data always begins at exactly 14 months prior to the current date. This makes it impossible to export historic data prior to the rolling 14 month date. Is there a workaround for this?
Hi Stella, at 28:10 there is a setting "Full refresh | Overwrite". Is it better to select "Incremental | Append", so we don't reload all the data every day and only sync the changes?
You didn't do a count distinct in the Google Sheets example; the values were counts of orders, not counts of unique customers. Also, since you took cumulative sales, the revenue-per-customer calculation should be cumulative sales / M0, or total customers in each month.
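To illustrate why the distinction matters, here is a tiny sketch with made-up toy data (not the video's numbers): counting order rows overstates the customer count whenever a customer purchases more than once in a month.

```javascript
// Toy order rows: one row per order, so customers can repeat within a month
const orders = [
  { month: 'M0', customer: 'A' },
  { month: 'M0', customer: 'A' }, // repeat purchase by customer A
  { month: 'M0', customer: 'B' },
];

// COUNT of orders — what a plain COUNT in the sheet returns
const orderCount = orders.length; // 3

// COUNT DISTINCT of customers — what revenue-per-customer actually needs
// (in Google Sheets this would be COUNTUNIQUE over the customer column)
const uniqueCustomers = new Set(orders.map(o => o.customer)).size; // 2
```

Dividing cumulative sales by `orderCount` instead of `uniqueCustomers` would understate revenue per customer here.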
Great video, thanks Stella! Quick question… I'm pulling pretty large data sets (5+ years by day) so we can preserve our historical data, and I'm constantly running into sampling issues. Are there any tools that can pull the Google Analytics data without using sampled data?
Are there issues/challenges with sampled data? On the Airbyte website it mentions that could be a possibility but that was written prior to your video so I wonder if that is no longer an issue?
Great explanation. Can you please advise on this scenario: tags are firing when pushed through the console, but not when the page is opened in the browser?
Great explanation, thank you very much! Question: you are using "Full refresh - Overwrite". That would sync all data from scratch every time the sync runs, right? What would be the ideal sync mode for that case? Do we need dedup history for daily data?
Yes, "Full refresh - Overwrite" will delete and reload all data. You can look into "Incremental Sync - Deduped History" or other variants for your use case. Here are the docs: docs.airbyte.com/understanding-airbyte/connections/. Hope that helps
Hi! Thank you for your amazing video tutorial. Sorry, but I still have a question: who does the push of variables into the dataLayer? Does the developer add this push code, with all the necessary variables, on each page, so that we can then access them via DLV in GTM?
@mashamasha722 Yes, usually the developer adds the dataLayer.push() code. Depending on the type of event, the code can be added to each page or to specific button clicks, etc. For example, an add_to_cart/purchase event can be pushed when the user clicks the Add To Cart/Purchase button, using the button's onClick() event.
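A minimal sketch of what that developer-added code might look like. The event name follows the GA4 ecommerce convention (add_to_cart), but the item fields and the handler name here are illustrative assumptions, not taken from the video:

```javascript
// Mirror of the standard GTM snippet: create the global dataLayer array
// if it doesn't already exist (window.dataLayer = window.dataLayer || [])
var dataLayer = typeof dataLayer !== 'undefined' ? dataLayer : [];

// Hypothetical handler the developer would wire to the Add To Cart button,
// e.g. button.addEventListener('click', () => onAddToCart(item))
function onAddToCart(item) {
  dataLayer.push({
    event: 'add_to_cart',        // GTM triggers can fire on this event name
    ecommerce: { items: [item] } // variables GTM can read via DLV
  });
}

// Simulating a click for illustration:
onAddToCart({ item_id: 'SKU123', item_name: 'Example Product', price: 19.99, quantity: 1 });
```

In GTM you would then create a Custom Event trigger for `add_to_cart` and Data Layer Variables (e.g. `ecommerce.items`) to read the pushed values.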