It's official! The Socratica Python Kickstarter was a success! Thank you to all of our supporters. Because of you, many more Python videos coming soon!! 💜🦉
You're a life-saver. I just got a software engineering job because of your SQL videos. I had very little experience with SQL, and most of the interviewer's questions had to do with it. Thank you so much!
Karl, we thought we had written back to you earlier - we were just blown away by your comment and we can't tell you how much it meant to us. Thank you for sharing - it has really inspired us to keep making videos.
@Socratica You guys are great... I'm having an assessment exam at one of the leading MNCs and I started watching your videos... Is the SQL series over, or are many more videos to come??
For those of you finding it difficult figuring out how to import the 'earthquake' table:
1. In pgAdmin: Tools > Query Tool
2. Paste this into the Query Editor:
   CREATE TABLE public.earthquake (
       earthquake_id integer NOT NULL,
       occurred_on timestamp without time zone,
       latitude numeric,
       longitude numeric,
       depth numeric,
       magnitude numeric,
       calculation_method character varying,
       network_id character varying,
       place character varying,
       cause character varying,
       CONSTRAINT earthquake_pkey PRIMARY KEY (earthquake_id)
   ) WITH (OIDS = FALSE);
3. Now execute (F5)
4. Right-click the 'earthquake' table
5. Use Import (make sure the 'Header' switch is set to 'Yes')
6. How do you like them apples? :-)
Awesome. Thanks Nerdman! Just a note: under the Miscellaneous section of Import, I had to toggle the HEADER switch from "NO" to "YES" to make the import successful.
When trying to import to the table I'm getting an error that says: ERROR: missing data for column "occurred_on" CONTEXT: COPY earthquake, line 2: "" Any idea how to fix this?
When importing, be sure to do the following:
1) Create the table first as described by Nerdman
2) Right-click the table, select the "Import/Export" option
3) Set Header to "Yes"
4) Set Delimiter to ","
I appreciate the video a lot, but please include stuff like this for us newbies!
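If you'd rather script the import than click through the pgAdmin dialog, here's a minimal sketch of the same idea in Python. It uses the built-in sqlite3 module instead of Postgres, and a made-up two-row sample CSV, purely to show the pattern: skipping the header row by hand is exactly what the "Header: Yes" switch does for you, and the comma delimiter is the csv module's default.

```python
import csv
import io
import sqlite3

# A tiny made-up stand-in for the real earthquake.csv (same column order)
sample_csv = """earthquake_id,occurred_on,latitude,longitude,depth,magnitude,calculation_method,network_id,place,cause
1,1969-01-01 00:00:00,34.05,-118.25,10.0,4.5,ml,ci100,Los Angeles,earthquake
2,1970-02-02 12:30:00,37.77,-122.42,8.2,5.1,ml,nc200,San Francisco,earthquake
"""

conn = sqlite3.connect(":memory:")  # in-memory SQLite DB, stand-in for Postgres
conn.execute("""CREATE TABLE earthquake (
    earthquake_id INTEGER PRIMARY KEY,
    occurred_on TEXT, latitude REAL, longitude REAL, depth REAL,
    magnitude REAL, calculation_method TEXT, network_id TEXT,
    place TEXT, cause TEXT)""")

reader = csv.reader(io.StringIO(sample_csv))  # delimiter defaults to ","
next(reader)  # skip the header row -- the "Header: Yes" switch in pgAdmin
conn.executemany(
    "INSERT INTO earthquake VALUES (?,?,?,?,?,?,?,?,?,?)", list(reader)
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM earthquake").fetchone()[0]
print(count)  # 2 -- both data rows imported, header skipped
```

Without the `next(reader)` line you'd get the SQLite equivalent of the "missing data for column" error people hit when the Header switch is left at "No".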
Warning... this channel is addictive!!! I've spent the last 7 years writing SQL queries for a living, and I've found myself watching a video about a SELECT statement; worse, I'm enjoying every second of it 😄👍
I have struggled with SQL for YEARS. I'm not a "DB S/W Engineer," just an "ORDINARY S/W Engineer," but even so I've had to re-learn the basics of SQL over and over. When I'm overwhelmed by ideas just getting something small to work, I tend to FORGET everything SQL-ish after the project. This VIDEO, like ALL the Socratica videos on Python and now SQL, is BRILLIANT.
Love the way you are presenting the material. Very simple, easy to understand, and most importantly it keeps us engaged and looking for more. Keep these coming. Cheers!!
It's so funny that I'm a Chilean guy watching and learning this amazing content. I love to learn with this channel. I don't even speak English that well, but I have to say that I can understand everything. THANK YOUUU
I've attended many classes about SQL, but none were as clear as this; the way you presented it made it really easy for many dreamers. Thanks! It would be even better if you could start on Java.
I love this channel so much! I was having a hard time understanding this and feeling dumb, but this video helped me understand the basics. The way you explain is very engaging and amazing! Thank you again!
I think I am in love with Ulka! You are hilarious. You and the whole Socratica team, for that matter, create the most amazing content. I am surprised that these are free! You guys are saviours who make the internet (or matrix, wink wink...) more beautiful.
I appreciate your way of teaching, ma'am. I became your student today, and I'll learn from beginner level to professional. I'm sure that your videos will make me a professional. Thanks!
For those of you finding it difficult figuring out how to import the 'earthquake' table:
1. In pgAdmin: Tools > Query Tool
2. Paste this into the Query Editor:
   CREATE TABLE public.earthquake (
       earthquake_id integer NOT NULL,
       occurred_on timestamp without time zone,
       latitude numeric,
       longitude numeric,
       depth numeric,
       magnitude numeric,
       calculation_method character varying,
       network_id character varying,
       place character varying,
       cause character varying,
       CONSTRAINT earthquake_pkey PRIMARY KEY (earthquake_id)
   ) WITH (OIDS = FALSE);
3. Now execute (F5)
4. Right-click the 'earthquake' table and choose Import/Export
5. Set Header to 'Yes'
6. Set Delimiter to ','
7. Import it! And you are good to rock n roll.
FYI: To save the csv file, click on 'View raw', right-click, and save it as 'earthquake.csv' (under 'All files' type, or csv if the option is available).
Thx to @Nerdman and @WarlordSquerk
I absolutely loved your Python videos. They were and still are a class apart from anything on YouTube. Now you're doing Postgres videos, which is a database I just started learning, and I couldn't be happier to see you again.
I love the way you explain without wasting time, unlike other YouTubers who speak for an hour and in the end still aren't clear. One question: if I want to use your CSV file, I must create a table based on your CSV file first, right?
Thank you for making this video; I like how clearly you teach. Now I know I have to follow this channel to learn more about SQL. This is the greatest SQL video tutorial I've ever seen, because by the end I understood what you were talking about. Good job!
Thank you! I went through an SQL/Azure class in college but learned nothing, then continued on with a BI course that expected us to be well versed in SQL. With regard to SQL, this playlist has taught me more than all my classes combined.
If you're querying between two dates, you can do: WHERE occurred_on BETWEEN '2020-01-01' AND CURRENT_DATE (or whatever dates you want). Saves you some time ;) literally
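Here's that BETWEEN tip as a runnable sketch, using Python's built-in sqlite3 with made-up rows rather than the real Postgres earthquake table. BETWEEN is inclusive on both ends, and ISO-8601 timestamp strings compare correctly as text; note the expected result assumes today's date is after 2021, since CURRENT_DATE is the upper bound.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # SQLite stand-in for Postgres
conn.execute(
    "CREATE TABLE earthquake (earthquake_id INTEGER, occurred_on TEXT, magnitude REAL)"
)
conn.executemany(
    "INSERT INTO earthquake VALUES (?, ?, ?)",
    [(1, "2019-12-31 23:00:00", 4.2),   # before the range: excluded
     (2, "2020-03-15 08:45:00", 5.6),
     (3, "2021-07-04 17:20:00", 3.9)],
)

# BETWEEN is inclusive; ISO date strings sort lexicographically = chronologically
rows = conn.execute(
    """SELECT earthquake_id FROM earthquake
       WHERE occurred_on BETWEEN '2020-01-01' AND CURRENT_DATE
       ORDER BY occurred_on"""
).fetchall()
print([r[0] for r in rows])  # [2, 3] -- only the quakes from 2020 onward
```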
This is a reminder of what I learned years ago: SELECT something FROM somewhere WHERE something = something. That's mostly what I remembered. But yes, ORDER BY and LIMIT, and especially LIMIT: you could sort the fetched rows in C#, for example, but getting 500 results back and then just picking the first one is a waste of network and processing power.
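The point about pushing the sort-and-trim into the database can be sketched like this, again with Python's built-in sqlite3 and invented data (not the course's real table). ORDER BY ... LIMIT 1 means the database returns exactly one row, instead of the client fetching everything and discarding the rest.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # SQLite stand-in for Postgres
conn.execute(
    "CREATE TABLE earthquake (earthquake_id INTEGER, place TEXT, magnitude REAL)"
)
conn.executemany(
    "INSERT INTO earthquake VALUES (?, ?, ?)",
    [(1, "Valdivia", 9.5), (2, "Anchorage", 9.2), (3, "Sumatra", 9.1)],
)

# Let the database sort and trim: only one row crosses the wire
strongest = conn.execute(
    "SELECT place, magnitude FROM earthquake ORDER BY magnitude DESC LIMIT 1"
).fetchone()
print(strongest)  # ('Valdivia', 9.5)
```

The alternative (fetching all rows and sorting client-side) gives the same answer but transfers and processes every row, which is the waste the comment describes.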