We live in the age of information, and the world is full of data. ETL is the process that transforms raw data into meaningful information, with each platform organizing that data and taking responsibility for it in its own way.
This channel demonstrates various ETL technologies and the unique ways each one manages organized data.
Can we execute complex CTE queries in Polars? I'm unable to do so; if possible, could you provide an example of a complex query? For example:

query = """
WITH sales_2017 AS (
    SELECT SUM(TotalNetAmount) AS TotalSales_2017
    FROM salesorder
    WHERE EXTRACT(year FROM CAST(OrderDate AS TIMESTAMP)) = 2017
      AND EXTRACT(month FROM CAST(OrderDate AS TIMESTAMP)) BETWEEN 1 AND 3
),
sales_2016 AS (
    SELECT SUM(TotalNetAmount) AS TotalSales_2016
    FROM salesorder
    WHERE EXTRACT(year FROM CAST(OrderDate AS TIMESTAMP)) = 2016
      AND EXTRACT(month FROM CAST(OrderDate AS TIMESTAMP)) BETWEEN 1 AND 3
)
SELECT TotalSales_2017, TotalSales_2016
FROM sales_2017, sales_2016
"""

I'm getting this error: ColumnNotFoundError: TotalSales_2016
When I click on Launch Matillion ETL, it displays an error saying it timed out, and it doesn't load the page. What is the solution for this? I'm stuck at 6:45.
Hello, yes, this course covers all the modules in dbt. Instead of BigQuery, I have used RDS PostgreSQL; all the processes remain the same. You can enroll in this course if interested: www.udemy.com/course/dbt-cloud/?referralCode=2F049047DBF18073DD6C
Hey man, did you previously work as a DJ? With this music it feels like I'm at a disco... there's no need for the background noise; it's very distracting. Keep it simple.
Hi, this video shows how to create a PostgreSQL instance, but you can follow the same process to create an Oracle one. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-F906og02ejM.html
Hi, thanks for the tips. I followed the steps shown in the video. However, in my case I need to write to an on-premises Oracle (external) database. I created the JDBC connection (it works), a crawler pointing to S3 (OK), and a crawler pointing to Oracle (OK). I then created the Glue job reading from S3 and targeting Oracle. However, when running the job, it generates the following error:

Thread-5 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration.
java.lang.ClassNotFoundException: org.apache.logging.log4j.core.lookup.JndiLookup
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
Thread-5 INFO Log4j appears to be running in a Servlet environment, but there's no log4j-web module available. If you want better web container support, please add the log4j-web JAR to your web archive or server lib directory.

How do I enable log4j?
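For reference, the S3-to-Oracle flow described above can be sketched as a Glue job script roughly like the one below. This is only a sketch under assumptions: the catalog database, table, and connection names (`glue_db`, `salesorder_s3`, `oracle-onprem`, `SALESORDER`, `ORCL`) are placeholders, not names from the video, and the script can only run inside the AWS Glue job runtime:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the S3 data through the table the S3 crawler created (placeholder names)
source = glue_context.create_dynamic_frame.from_catalog(
    database="glue_db",
    table_name="salesorder_s3",
)

# Write to the on-premises Oracle via the JDBC catalog connection
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=source,
    catalog_connection="oracle-onprem",
    connection_options={"dbtable": "SALESORDER", "database": "ORCL"},
)

job.commit()
```

Note that the log4j messages quoted above are warnings from the Glue runtime's logging setup, not from this script itself.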
Sure. I have these courses on Udemy and Tutorials Point. You can enroll if required. www.udemy.com/course/dbt-cloud/?referralCode=2F049047DBF18073DD6C www.tutorialspoint.com/learn-dbt-data-build-tool/index.asp
In your previous video you selected the ETL role for the DB permissions. I would have found the explanation in this requested follow-up most helpful if you had demonstrated creating and setting up the ETL role and its policy. Your previous video was great and generally very comprehensive, and this was the only part I think someone could get stuck on when following along and building this small S3-Glue-RDS pipeline. This video was good at showing which roles were involved, but it still leaves the setup a mystery; it would be great if you demonstrated it. Thank you for your hard work.
Yes, it is a good approach to learn any ETL tool. dbt performs transformation alone; for the extract and load steps, you still need a good understanding of an ETL tool.