This was a great video! It really helped fill in the missing pieces for getting started with SQL Developer. Two hints: SQL Developer seems to get confused when you ask it to create lowercase usernames; it worked better for me when I created an uppercase username. Also, I found it easier to create the user in a SQL*Plus session. Here are my commands:

CREATE USER "JOEDATA" IDENTIFIED BY "mypasswd" DEFAULT TABLESPACE "USERS";
ALTER USER "JOEDATA" QUOTA UNLIMITED ON "USERS";
GRANT "DBA" TO "JOEDATA";
COMMIT;

Hope this helps.
I have another video on this channel describing the point-in-time recovery process with XE. You can do it with simple RMAN commands. Both point-in-time recovery and hot backups rely on enabling transaction log archiving.
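For anyone who can't find the other video, here is a minimal sketch of what that process looks like. The date and database names are placeholders, and this assumes the database is already running in ARCHIVELOG mode and a backup exists; see the RMAN documentation for the full procedure.

```
-- One-time setup as SYSDBA: enable archiving
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE ARCHIVELOG;
ALTER DATABASE OPEN;

-- Later, in an RMAN session, restore to a point in time:
RUN {
  SET UNTIL TIME "TO_DATE('2013-05-01 12:00:00','YYYY-MM-DD HH24:MI:SS')";
  RESTORE DATABASE;
  RECOVER DATABASE;
}
ALTER DATABASE OPEN RESETLOGS;
```

Note the RESETLOGS at the end: after an incomplete recovery you open the database with a new log sequence, so take a fresh backup right away.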
Thanks for the video. I'm wondering, how do we manage the backups so we can create many of them and then restore the database to a specific point in time? Is it possible using the same Backup / Restore Database scripts bundled with the Oracle installation, or do I need to do some manual copying of backups from the BACKUPSET in the fast_recovery_area?
This is really good and very informative for such a short video. Can you post a link to the document that can be used as a reference, so I can refer to it when I have a situation to look up? Thank you.
It's hard to say; it depends on the kind of data you have, how much you have, how complex your processing rules for SQL*Loader are, what sort of connection you have, the power of your server, and so on. For speed, try to use direct path instead of conventional path.
For direct path, you can add OPTIONS (DIRECT=TRUE) at the top of the control file, or pass direct=true on the sqlldr command line. A fantastic comparison of the differences between direct and conventional path is here: sshailesh.wordpress.com/2009/05/03/conventional-path-load-and-direct-path-load-simple-to-use-in-complex-situations/
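For concreteness, here is a small control-file sketch with the option in place. The file, table, and column names are made up for illustration; adjust them to your own data.

```
-- example.ctl (all names are placeholders)
OPTIONS (DIRECT=TRUE)
LOAD DATA
INFILE 'employees.csv'
APPEND
INTO TABLE employees
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(emp_id, emp_name, hire_date DATE "YYYY-MM-DD")
```

You would then run it with: sqlldr user/password@XE control=example.ctl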
Some problems I ran into:
- Folder names cannot have spaces in them.
- The table needs to be empty; if it isn't, empty it with TRUNCATE rather than a DELETE statement, or use the REPLACE option in the control file.
- To save the CTL file, in Save As choose "All Files" instead of text file and add .ctl after the file name.
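On the second point: the "table must be empty" error only applies to the default INSERT load method. You can put the load method directly in the control file instead of emptying the table by hand. A minimal sketch (file and table names are placeholders):

```
LOAD DATA
INFILE 'data.csv'
TRUNCATE            -- or REPLACE, or APPEND to keep existing rows; INSERT is the default
INTO TABLE mytable
FIELDS TERMINATED BY ','
(col1, col2)
```

TRUNCATE is fast but can't be rolled back; REPLACE deletes the old rows with a DELETE, which is slower but fires delete triggers.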
A very nice tutorial for understanding the tkprof utility. Do you have similar material on Explain Plan? I am also looking for a video showing how to optimize query performance, even for very simple queries, using explain plan or any other tool.
Thanks Stephen, just two questions: which user should log in, or can this be done by any particular user? And should we turn off the two session parameters when finished, or does Oracle do it for us automatically when we disconnect? Thanks a lot.
Thanks so much, but I think we found a way: we are going to FTP the file to the database server (Unix) and then run a cron job from there. I am still in the process of setting up the FTP. Will keep you updated. Thanks for the help, really appreciate it.
The username and password you will use for SQL*Loader are just the database login; no process will work without one, so I would make that your first priority. Since you are using SSIS, I assume you are working mainly in a Windows environment. I can't post links in YouTube comments, but there are many examples on the web of creating a batch script that calls SQL*Loader, which you could run from Windows Task Scheduler.
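Since I can't link out, here is a rough sketch of the kind of script a scheduler would run. All the names (user, SID, control file) are placeholders; it just builds the sqlldr command line and prints it, so remove the echo once you've verified the command looks right.

```shell
#!/bin/sh
# Sketch of a scheduled SQL*Loader invocation (all names are hypothetical).
DB_USER="JOEDATA"                # database login from the video
DB_PASS="mypasswd"
DB_SID="XE"
CTL_FILE="load_employees.ctl"    # your SQL*Loader control file
LOG_FILE="load_employees.log"

# Build the command; drop the `echo` below to actually run the load.
CMD="sqlldr ${DB_USER}/${DB_PASS}@${DB_SID} control=${CTL_FILE} log=${LOG_FILE} direct=true"
echo "$CMD"
```

The same idea works as a Windows .bat file with `set` variables; the sqlldr arguments are identical.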
Hi Steve, thanks so much for the reply. Do you have an example I can take a look at? This sounds too complicated to me. I have done several SSIS jobs, so I was wondering if I could do it in SSIS, but I could not succeed, so I tried SQL*Loader. I used the SQL Developer wizard, but it's only a one-time load into the table. I tried to do what you have here on YouTube, but I think I do not have a username and password, so I am really stuck. I have been trying to do this for so many days now. Let me know.
Sure - you could run a scheduled job / cron that would perform this process from a batch / shell script. You might also want to look into using more advanced ETL tools with built-in job schedulers. For example, you could check out the free Pentaho suite which supports scheduling for Kettle (ETL) jobs.
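If you go the cron route, the entry itself is a one-liner. A sketch, with hypothetical paths, assuming your load logic lives in a shell script:

```
# crontab entry: run the load script every night at 2:00 AM
# (minute hour day-of-month month day-of-week command)
0 2 * * * /home/joe/scripts/run_sqlldr.sh >> /home/joe/logs/load.log 2>&1
```

Redirecting stdout and stderr to a log file as shown makes it much easier to debug failed overnight loads.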