Let's Code with Kundan Kumar
Welcome to a channel created exclusively for tech enthusiasts with a passion for coding in Python, R, machine learning, Django, big data tools, the Hadoop ecosystem, ReactJS, Docker, Java, Android, Flutter, React Native, block-based coding (like Scratch, mBlock, Thumbnel, etc.), and various technical courses. Join us for an immersive journey into the world of technology, where you'll find exciting content and gain valuable insights. Subscribe now and elevate your coding skills to new heights! 💻🌟

==Follow me on==
1) LinkedIn: www.linkedin.com/in/kundankumar011

2) Instagram: @kundan.kumar011

3) X/Twitter: @KundanKumar011

4) Facebook: facebook.com/kundan.kumar011?mibextid=ZbWKwL
Creating an app in Django: Practical Guide
15:08
6 months ago
Comments
@yercoarancibia3366
@yercoarancibia3366 16 hours ago
This video is the most complete one I have seen on YouTube. Excellent video and explanation; you did a great job, and this is the best tutorial.
@kundankumar011
@kundankumar011 12 hours ago
@@yercoarancibia3366 Thank you so much for appreciating this Hadoop installation tutorial. It motivates me. I hope you have subscribed to my channel, dear.
@PrabhatSingh-tf6km
@PrabhatSingh-tf6km 2 days ago
Sir, I love your videos. Can you please make a video on installing HBase?
@kundankumar011
@kundankumar011 2 days ago
Thank you so much for loving my videos. Yes, I am adding it to my to-do list; once it's ready you will get a notification, dear. Please encourage me by subscribing to my channel and pressing the bell icon 🔔
@PrabhatSingh-tf6km
@PrabhatSingh-tf6km 1 day ago
@@kundankumar011 I have already subscribed to your channel and recommended it to my friends as well. I installed Hadoop and Hive watching your videos.
@kundankumar011
@kundankumar011 1 day ago
@@PrabhatSingh-tf6km Pleasure, dear ❤️
@amazighkab9904
@amazighkab9904 14 days ago
Before executing hdfs namenode -format, you must execute the command rmdir /S "C:\hadoop\data\datanode"; otherwise you will get the error Incompatible clusterIDs in C:\hadoop\data\datanode: namenode clusterID = CID-9ffe06b7-e903-46f6-a3fe-508ccc12e155; datanode clusterID = CID-d349
@kundankumar011
@kundankumar011 6 days ago
Thank you for watching my video, and apologies for the delayed response. I hope you were able to resolve the error; if not, here's a quick guide. The "Incompatible clusterIDs" error occurs when the NameNode's and DataNode's cluster IDs do not match. This typically happens if Hadoop is reinstalled, or if the namenode -format command is run again without resetting the DataNode. To resolve it, delete the contents of the DataNode directory before formatting the NameNode.
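The fix described here can be sketched as a Windows command sequence. The C:\hadoop\data\datanode path is taken from the comment above and is an assumption about your setup (use whatever dfs.datanode.data.dir points to), and note that this deletes the DataNode's stored blocks:

```bat
:: Stop any running daemons first
stop-all.cmd

:: Remove the stale DataNode storage (assumed path; this deletes HDFS block data)
rmdir /S /Q "C:\hadoop\data\datanode"

:: Re-format the NameNode, then restart the daemons
hdfs namenode -format
start-all.cmd
```

After restarting, run jps and confirm that both NameNode and DataNode stay up.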
@nihalsingh6667
@nihalsingh6667 14 days ago
Sir there is an error showing Usage Error: Unrecognized or legacy configuration settings found: dir - run "yarn config -v" to see the list of settings supported in Yarn (in <environment>) $ yarn run [--inspect] [--inspect-brk] [-T,--top-level] [-B,--binaries-only] [--require #0] <scriptName> ... C:\Windows\System32>
@nihalsingh6667
@nihalsingh6667 14 days ago
Localhost 8088 is not working
@kundankumar011
@kundankumar011 6 days ago
Apologies for the late reply, dear. If localhost:8088 (the YARN ResourceManager web interface) is not working, it typically means that YARN is either not running correctly or a configuration issue is preventing access. Cross-check the YARN configuration file yarn-site.xml, and also check your firewall settings to make sure the port is not blocked.
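For reference, a minimal yarn-site.xml fragment for a single-node setup might look like the following. The hostname and port values are the usual defaults, shown here as an assumption rather than taken from the video:

```xml
<configuration>
  <!-- Host the ResourceManager binds to on a single-node setup -->
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>localhost</value>
  </property>
  <!-- Address of the ResourceManager web UI (the page served at localhost:8088) -->
  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>localhost:8088</value>
  </property>
</configuration>
```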
@فروتي-ي5ش
@فروتي-ي5ش 14 days ago
Localhost doesn't work, what should I do?
@kundankumar011
@kundankumar011 14 days ago
Sorry to hear that you are facing a challenge with localhost not working. This could be due to various reasons, dear: 1) Cross-check whether the Hadoop daemons (NameNode, ResourceManager) are still running, using the jps command, while accessing localhost. 2) Cross-check that the Hadoop configuration files are set up properly. 3) Check the firewall settings; sometimes the firewall blocks ports like 8080, 9000, 50070, etc. If so, change the settings to allow them. Please subscribe to my channel to encourage me, dear, and so you don't miss upcoming relevant videos. Thank you ❤️
@kundankumar011
@kundankumar011 14 days ago
@@فروتي-ي5ش Hello dear. I hope you managed to fix the issues. Please don't forget to subscribe to my channel to encourage me.
@nshunguyimfuraabou-bakar1000
@nshunguyimfuraabou-bakar1000 15 days ago
Many thanks!!!
@kundankumar011
@kundankumar011 15 days ago
@@nshunguyimfuraabou-bakar1000 Pleasure ❤️
@samiaahmed-k7n
@samiaahmed-k7n 15 days ago
When I am about to create a new Flutter project, an error occurs saying it is unable to locate flutter.sdk. How do I resolve this?
@kundankumar011
@kundankumar011 15 days ago
@@samiaahmed-k7n Thank you so much for watching this video 💗. Sorry to hear that you are facing the SDK-not-found error. To fix this, make sure you have set the correct environment variable path for the SDK. Please rewind the video and see how to set the SDK path in the environment variables, dear. Please don't forget to encourage me by subscribing to my channel so that you don't miss notifications on upcoming relevant videos.
@amazighkab9904
@amazighkab9904 15 days ago
When I execute the command "start-all.cmd" and type "jps", at first all the daemons are there (NodeManager, NameNode, ResourceManager, DataNode); after a moment, when I retype "jps", I see only the ResourceManager and DataNode daemons. Is this normal? Thank you
@kundankumar011
@kundankumar011 15 days ago
@@amazighkab9904 Sometimes the NameNode and NodeManager may stop automatically due to a lack of space or resources, but it should not happen every time, dear. Check the logs under $HADOOP_HOME/logs to see why they stopped.
@samiaahmed-k7n
@samiaahmed-k7n 16 days ago
Very helpful video. Thank you
@kundankumar011
@kundankumar011 16 days ago
@samiaahmed-k7n Glad to know that it was helpful to you as well ❤ Thank you so much for subscribing to my channel
@amazighkab9904
@amazighkab9904 18 days ago
Thank you bro!! This is the best video that explains how to install Hadoop
@kundankumar011
@kundankumar011 18 days ago
Thank you so much for the appreciation, dear 💗. It motivates me. I hope you subscribed to my channel and pressed the bell 🔔 icon so that you don't miss notifications on upcoming relevant videos
@claudebuhanga4262
@claudebuhanga4262 19 days ago
Sir, I am using Windows 11, but I get this error: "C:\hive\bin>hive --service schematool -dbType derby -initSchema" "Missing hadoop installation: C:\hadoop\bin must be set" when I run step "11. To Initialize Hive Schema: a) Navigate to `C:\hive\bin` and run below command: hive --service schematool -dbType derby -initSchema". It is different from the error in your video
@kundankumar011
@kundankumar011 19 days ago
Sorry to hear that you are facing issues. The error "Missing hadoop installation: C:\hadoop\bin must be set" means that Hive cannot find your Hadoop installation because the HADOOP_HOME environment variable is not set correctly. Please set HADOOP_HOME, and also cross-check that JAVA_HOME is set properly. Follow the steps in the video, then cross-check what you may have missed, dear. For the Hive services, the errors you encounter can vary depending on the specific configuration of your computer.
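The environment-variable check can be run from a Windows command prompt roughly as follows. The install paths are assumptions for illustration (a JDK path without spaces is deliberately used, since Hadoop's scripts often fail on paths like "Program Files"):

```bat
:: Set the variables (assumed install locations; use your own)
setx HADOOP_HOME "C:\hadoop"
setx JAVA_HOME "C:\java\jdk1.8.0_202"

:: Open a NEW command prompt, then verify what Hive will see
echo %HADOOP_HOME%
echo %JAVA_HOME%
dir "%HADOOP_HOME%\bin\hadoop.cmd"
```

If the last command cannot find hadoop.cmd, HADOOP_HOME points at the wrong folder.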
@iranzilionel5090
@iranzilionel5090 21 days ago
Is Larissa a good student?
@kundankumar011
@kundankumar011 21 days ago
@iranzilionel5090 Thank you for your comment, dear ❤. However, it seems unrelated to this video, as you are asking about another viewer. Larissa? Oh, she is the star student in the class of awesomeness. She is an obedient student who always asks the best questions and brings amazing energy to the group. Keep up the fantastic work, Larissa! 💪🎓
@LariChou-h8o
@LariChou-h8o 21 days ago
Much appreciation to my lecturer
@kundankumar011
@kundankumar011 21 days ago
@LariChou-h8o Thank you so much for your appreciation, dear
@tuyishimes.pcardinard3002
@tuyishimes.pcardinard3002 21 days ago
This video is well elaborated; however, by the time I was about to finish installing Hive, I encountered the following error. How do I resolve it? C:\Windows\System32>StartNetworkServer -h 0.0.0.0 'StartNetworkServer' is not recognized as an internal or external command, operable program or batch file.
@kundankumar011
@kundankumar011 21 days ago
Sorry to hear that you encountered the error. Looking at it, the error shows that there is an issue with how the Path is set (StartNetworkServer is a Derby script, so Derby's bin directory must be on your Path). I advise you to cross-check the Path and, if possible, watch the video again to see whether you made any mistakes while setting it.
@tuyishimes.pcardinard3002
@tuyishimes.pcardinard3002 20 days ago
@@kundankumar011 Thanks, I have been able to complete the installation.
@kundankumar011
@kundankumar011 20 days ago
@@tuyishimes.pcardinard3002 Congratulations, dear. Bravo!
@claudebuhanga4262
@claudebuhanga4262 22 days ago
It works well on Windows 11; none of the errors occurred.
@kundankumar011
@kundankumar011 22 days ago
Glad to know it's working well, dear.
@SugunaDinesh-k1g
@SugunaDinesh-k1g 23 days ago
Thank you, sir, really helpful
@kundankumar011
@kundankumar011 23 days ago
Glad to know that it was very helpful to you as well. I hope you didn't forget to subscribe to receive notifications for upcoming relevant videos.
@ghmaniyamaniya7067
@ghmaniyamaniya7067 25 days ago
Thanks
@kundankumar011
@kundankumar011 25 days ago
Welcome
@wizardop2100
@wizardop2100 27 days ago
I encountered this problem: 'jps' is not recognized as an internal or external command, operable program or batch file. Can you please provide the solution?
@kundankumar011
@kundankumar011 27 days ago
Thank you so much for watching my video. The error 'jps' is not recognized as an internal or external command, operable program or batch file typically occurs because the Java Development Kit (JDK) tools (like jps) are not properly set up, or the JAVA_HOME environment variable is not configured correctly. Please follow these steps to fix it: 1) Check whether the JDK is installed: open a terminal or command prompt and type java -version to check the installed version of Java. 2) Ensure the JDK bin directory is included in the PATH environment variable. Please subscribe to my channel to encourage me, dear, and don't forget to press the bell icon to receive notifications on upcoming relevant videos
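The two checks above can be run from a command prompt like this (the JDK install path is an assumption for illustration):

```bat
:: 1) Confirm a JDK (not just a JRE) is installed
java -version

:: 2) Confirm jps can be found; if not, add the JDK bin folder to PATH
where jps
setx PATH "%PATH%;C:\java\jdk1.8.0_202\bin"

:: Open a NEW command prompt and re-test
jps
```

jps ships with the JDK only, so if java -version works but jps is missing, you likely have a JRE on your PATH instead of the JDK.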
@kundankumar011
@kundankumar011 27 days ago
@@wizardop2100 Hello dear, please encourage me by subscribing to my channel.
@bimenyimanajanvier6975
@bimenyimanajanvier6975 28 days ago
🎉😅
@kundankumar011
@kundankumar011 28 days ago
😅😅😅😅😅
@aazeftuff8407
@aazeftuff8407 1 month ago
Sir all the steps were clear but i'm getting this error C:\Users\srika>cd c:\hive\bin c:\hive\bin>hive --service schematool -dbType derby -initSchema SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/C:/hive/lib/log4j-slf4j-impl-2.18.0.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/C:/pyspark/hadoop/share/hadoop/common/lib/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: See www.slf4j.org/codes.html#multiple_bindings for an explanation. SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory] Exception in thread "main" java.lang.UnsupportedOperationException: 'posix:permissions' not supported as initial attribute at sun.nio.fs.WindowsSecurityDescriptor.fromAttribute(WindowsSecurityDescriptor.java:358) at sun.nio.fs.WindowsFileSystemProvider.createDirectory(WindowsFileSystemProvider.java:492) at java.nio.file.Files.createDirectory(Files.java:674) at java.nio.file.TempFileHelper.create(TempFileHelper.java:136) at java.nio.file.TempFileHelper.createTempDirectory(TempFileHelper.java:173) at java.nio.file.Files.createTempDirectory(Files.java:950) at org.apache.hadoop.util.RunJar.run(RunJar.java:296) at org.apache.hadoop.util.RunJar.main(RunJar.java:245) PLEASE HELP!!!
@riteshchavan8308
@riteshchavan8308 1 month ago
Thanks.
@kundankumar011
@kundankumar011 1 month ago
@@riteshchavan8308 Pleasure! I hope you subscribed to my channel to encourage me and so you don't miss any notifications on upcoming relevant videos.
@dishtisoni3937
@dishtisoni3937 1 month ago
Hi, thank you for the details; it is very helpful!!! However, I am getting the error "Missing Hive Execution Jar: C:\hive\lib/hive-exec-*.jar". How do I resolve this? Thanks.
@kundankumar011
@kundankumar011 1 month ago
@@dishtisoni3937 Thank you for watching my video; happy to know the detail in the video was helpful. The error you're encountering suggests that Hive can't find the required hive-exec jar file. The steps to fix it are below: 1) Check the Hive installation: ensure that you have installed Hive correctly. The hive-exec jar should be in the C:/hive/lib/ directory; if it's missing, you may need to reinstall Hive. 2) Cross-check the environment variables: make sure HIVE_HOME is set to C:/hive, and ensure that the CLASSPATH includes C:/hive/lib/*. 3) Configuration files: check your hive-site.xml and make sure all configurations are correct and point to valid paths. Let me know if you managed to fix it, dear. Please subscribe to my channel if you haven't yet, to encourage me and so you don't miss notifications on upcoming relevant videos. Keep learning!
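Those first two checks can be run from a command prompt roughly as follows (C:\hive is an assumed install location):

```bat
:: Confirm HIVE_HOME points at the Hive install
echo %HIVE_HOME%

:: Confirm the execution jar actually exists under lib
dir "%HIVE_HOME%\lib\hive-exec-*.jar"
```

If the dir command finds nothing, the Hive download was incomplete or HIVE_HOME points at the wrong folder.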
@hategekimanajmv5527
@hategekimanajmv5527 1 month ago
Hello lecturer, please give us a video on OOP in Java
@kundankumar011
@kundankumar011 1 month ago
The introduction part is already there on my channel, but soon I will upload the core topics of Java.
@guhanplays9493
@guhanplays9493 1 month ago
Exception in thread "main" java.lang.UnsupportedOperationException: 'posix:permissions' not supported as initial attribute at sun.nio.fs.WindowsSecurityDescriptor.fromAttribute(WindowsSecurityDescriptor.java:358) at sun.nio.fs.WindowsFileSystemProvider.createDirectory(WindowsFileSystemProvider.java:492) at java.nio.file.Files.createDirectory(Files.java:674) at java.nio.file.TempFileHelper.create(TempFileHelper.java:136) at java.nio.file.TempFileHelper.createTempDirectory(TempFileHelper.java:173) at java.nio.file.Files.createTempDirectory(Files.java:950) at org.apache.hadoop.util.RunJar.run(RunJar.java:296) at org.apache.hadoop.util.RunJar.main(RunJar.java:245) how to solve this ,someone help
@guhanplays9493
@guhanplays9493 1 month ago
Please help me with this
@mohamedmoawad9670
@mohamedmoawad9670 1 month ago
You saved my life ❤
@kundankumar011
@kundankumar011 1 month ago
@@mohamedmoawad9670 Glad to know that it was very helpful for you and saved you a lot of time. Enjoy your learning, dear ❤️ I hope you subscribe to my channel to encourage me and so you don't miss notifications on upcoming relevant videos
@meghanajyothinagaram
@meghanajyothinagaram 1 month ago
Hi sir, can you please tell me the commands to count the number of words (wc -w) and the number of letters (wc -m)?
@kundankumar011
@kundankumar011 1 month ago
Thank you so much for watching my video, dear. I apologize for the delay in replying. To count the number of words and letters in a file on HDFS, you can pipe the file's contents to the wc command: C:\Windows\System32>hdfs dfs -cat /user/KUMAR/inputfile.txt | wc -w. If you are prompted with the error 'wc' is not recognized as an internal or external command, operable program or batch file, that is because wc is a Unix tool and Windows cmd does not provide it. You can either write a simple Hadoop MapReduce job to count words and letters, or use a Unix-like environment such as Git Bash. After installing Git Bash, open it and run the same command: hdfs dfs -cat /user/KUMAR/inputfile.txt | wc -w. Note: change the path of the file on HDFS as per your system. Let me know if it works. Please subscribe to my channel to encourage me and so you don't miss notifications on upcoming relevant videos.
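As an alternative that avoids wc entirely, the same counts can be computed with a small Python helper. This is a sketch of my own, not from the video; the HDFS path in the comments is the one used as an example above:

```python
def count_words_and_letters(text: str) -> tuple[int, int]:
    """Mirror `wc -w` (whitespace-separated words) and `wc -m` (characters)."""
    return len(text.split()), len(text)


if __name__ == "__main__":
    # In practice you would fetch the file from HDFS first, e.g.:
    #   hdfs dfs -get /user/KUMAR/inputfile.txt inputfile.txt
    # and then read it here; a literal string is used for illustration.
    words, letters = count_words_and_letters("hello hadoop world")
    print(words, letters)  # 3 18
```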
@knightkinglko
@knightkinglko 1 month ago
Hi, I followed your video and am stuck at point 12, where we give the command "hive" to start Hive. The error is java.io.EOFException: End of File Exception between local host is <> and destination host is <>. Any help here, please
@kundankumar011
@kundankumar011 1 month ago
Thank you for watching my video, dear, and sorry for the late reply. It looks like you're encountering an EOF (End of File) exception when trying to start Hive, which can be due to several reasons related to the configuration or the communication between nodes in your setup. 1. Check Java version compatibility: I am using Java 1.8 and Hadoop 3.2.4. 2. Check the Hive configuration files, especially hive-site.xml, to confirm it is configured as shown in the video. 3. Permissions and connectivity: make sure the user running Hive has sufficient permissions for Hadoop and HDFS, and that it can read/write to the necessary directories. Please subscribe to my channel if you haven't yet, to encourage me and so you don't miss notifications on upcoming relevant videos
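For reference, a minimal hive-site.xml for the embedded Derby metastore used in this kind of single-node setup often looks like the fragment below (the property values are the common Derby defaults, shown as an assumption rather than taken from the video):

```xml
<configuration>
  <!-- Embedded Derby metastore database, created on first use -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>org.apache.derby.jdbc.EmbeddedDriver</value>
  </property>
</configuration>
```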
@FeelFenix
@FeelFenix 1 month ago
This is an excellent tutorial for installing Hive, and the previous one (installing Hadoop) is very handy too. Now I need to make a connection with DBeaver; do you have any video? Can you share it, please?
@kundankumar011
@kundankumar011 1 month ago
@@FeelFenix Thank you for your appreciation of the steps for installing Hadoop and Hive❤️. Regarding your request for a video on connecting Apache Hive with DBeaver, I don’t have one yet, but I’ll try to find time to make it. I hope you’ve subscribed and clicked the bell icon on my channel to stay updated with my upcoming videos.
@cartoonplanet1880
@cartoonplanet1880 1 month ago
I have tried many times to render an HTML page in Django, but I cannot. What is the solution?
@kundankumar011
@kundankumar011 1 month ago
Thank you for watching my video! I'm sorry to hear that you're facing issues rendering HTML template files. There could be several reasons for this, but let me guide you through a few important steps that were demonstrated in the video: 1) Create a "templates" folder as guided in the video, and ensure it's placed correctly in your project directory. 2) Update "settings.py" to include the path to your templates directory; make sure the `TEMPLATES` setting has the correct `'DIRS'` path. 3) Check your HTML file name to ensure it's not typed incorrectly when rendering. If everything seems fine and the issue persists, please share your error logs with me so I can assist you further. Please subscribe to my channel if you haven't yet, to encourage me and so you don't miss any notifications of upcoming relevant videos.
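Steps 1) and 2) boil down to one entry in settings.py. The sketch below shows the relevant piece with an assumed project-level templates folder (in a generated Django project, BASE_DIR is usually Path(__file__).resolve().parent.parent; the layout here is illustrative):

```python
from pathlib import Path

# Assumed layout: a "templates" folder next to this settings file
BASE_DIR = Path(__file__).resolve().parent

TEMPLATES = [
    {
        "BACKEND": "django.template.backends.django.DjangoTemplates",
        # The project-level "templates" folder Django should search
        "DIRS": [BASE_DIR / "templates"],
        # Also search each installed app's own templates/ directory
        "APP_DIRS": True,
        "OPTIONS": {"context_processors": []},
    }
]
```

With this in place, render(request, "home.html") will look for BASE_DIR/templates/home.html (the file name home.html is illustrative).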
@cartoonplanet1880
@cartoonplanet1880 1 month ago
@@kundankumar011 I created the templates folder, path, urls, and settings as per your instructions in the video, and even then I got the error "templates doesn't exist". BTW, I subscribed to your channel. Do you have any social media platforms, such as a Messenger group or WhatsApp group, where I can get your support? Thank you. I am from Bangladesh 🇧🇩🇧🇩
@LariChou-h8o
@LariChou-h8o 1 month ago
Ooooh, very happy to learn with you, lecturer 🫂
@kundankumar011
@kundankumar011 1 month ago
Pleasure to know that you are happy learning from me. It motivates me to research more and more.
@glavanya9998
@glavanya9998 1 month ago
At first, when I installed everything, it was fine and all the environment variables were correct, but now I'm facing an issue with jps: after entering start-all.cmd, jps shows only Jps, with no NameNode, DataNode, ResourceManager, or NodeManager. Can you please help, sir?
@kundankumar011
@kundankumar011 1 month ago
Thank you so much for watching this video, and happy to know that you had successfully configured it before it stopped working. Since you say all the environment variables are set well, to fix this you must check the log files to see what is causing the error. Review the logs for any errors or issues with starting the daemons; they are typically located in $HADOOP_HOME/logs. Please subscribe to my channel if you haven't yet, to encourage me and so you don't miss notifications on upcoming relevant videos.
@ASHWINIM-v2q
@ASHWINIM-v2q 1 month ago
Thanks a lot for the detailed explanation and reference links
@kundankumar011
@kundankumar011 1 month ago
Pleasure to know that you liked the explanation and reference links. Please subscribe to my channel to encourage me and so you don't miss notifications on upcoming relevant videos.
@ASHWINIM-v2q
@ASHWINIM-v2q 1 month ago
Thanks for the steps
@kundankumar011
@kundankumar011 1 month ago
Pleasure, dear
@sivashankar47
@sivashankar47 1 month ago
Very informative and helpful.
@kundankumar011
@kundankumar011 1 month ago
Glad to know that this video was very informative to you as well. Please subscribe to my channel to encourage me and so you don't miss notifications on upcoming relevant videos.
@kuruvamaddileti1307
@kuruvamaddileti1307 1 month ago
Hi, could you please give the command to create a Hive table based on a file stored in HDFS?
@kuruvamaddileti1307
@kuruvamaddileti1307 1 month ago
I used CREATE EXTERNAL TABLE retail_data (record_no STRING,invoice_no STRING,stockcode STRING,description STRING,quantity INT,invoicedate STRING,price DOUBLE,customer_id STRING,country STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS TEXTFILE LOCATION '/data_BDSAssignmet/'; Where my CSV file is stored in data_BDSAssignment dir in hdfs Above commands created table but it did not load any data I am getting below output when I select * hive> select * from retail_data; 2024-09-08T21:38:14,830 INFO [main] org.apache.hadoop.hive.conf.HiveConf - Using the default value passed in for log id: 2f59663d-4a5c-4cde-9103-7e72ed85a913 2024-09-08T21:38:14,831 INFO [main] org.apache.hadoop.hive.ql.session.SessionState - Updating thread name to 2f59663d-4a5c-4cde-9103-7e72ed85a913 main 2024-09-08T21:38:15,464 INFO [2f59663d-4a5c-4cde-9103-7e72ed85a913 main] org.apache.hadoop.hive.common.FileUtils - Creating directory if it doesn't exist: hdfs://localhost:9000/tmp/hive/k.Prachetha/2f59663d-4a5c-4cde-9103-7e72ed85a913/hive_2024-09-08_21-38-14_855_3060643932666893220-1/-mr-10001/.hive-staging_hive_2024-09-08_21-38-14_855_3060643932666893220-1 OK Time taken: 0.696 seconds 2024-09-08T21:38:15,591 INFO [2f59663d-4a5c-4cde-9103-7e72ed85a913 main] org.apache.hadoop.hive.conf.HiveConf - Using the default value passed in for log id: 2f59663d-4a5c-4cde-9103-7e72ed85a913 2024-09-08T21:38:15,591 INFO [2f59663d-4a5c-4cde-9103-7e72ed85a913 main] org.apache.hadoop.hive.ql.session.SessionState - Resetting thread name to main
@kundankumar011
@kundankumar011 1 month ago
Thank you so much for watching my video, dear. I will try to share it soon, or I can make a video and upload it, dear. Please subscribe to my channel if you haven't yet, to encourage me and so you don't miss notifications on upcoming relevant videos
@sanoberfarooqui1048
@sanoberfarooqui1048 1 month ago
shutting down datanode Error: 2024-09-07 15:28:30,891 INFO common.Storage: Lock on C:\tmp\hadoop-pc\dfs\data\in_use.lock acquired by nodename 9528@DESKTOP-TNSAH2H 2024-09-07 15:28:30,891 WARN common.Storage: Failed to add storage directory [DISK]file:/C:/tmp/hadoop-pc/dfs/data java.io.IOException: Incompatible clusterIDs in C:\tmp\hadoop-pc\dfs\data: namenode clusterID = CID-6679ba00-f42f-4835-8440-3f7fe4748513; datanode clusterID = CID-13c1262e-453e-41e2-ac80-0542976931df
@kundankumar011
@kundankumar011 1 month ago
Thank you so much for watching my video, and sorry for the delay in replying. If you are still facing issues: the error you are encountering indicates a mismatch between the cluster IDs of the NameNode and DataNode. One way to fix it is to clear the DataNode data directory. If you're okay with clearing the DataNode's data and starting fresh, delete the DataNode storage directory and restart the DataNode; this will cause the DataNode to re-register with the NameNode using the current cluster ID. Then re-format the NameNode and restart the daemons. I hope this helps; otherwise, let me know, dear.
@sivashankar47
@sivashankar47 1 month ago
Thank you, followed the steps and it worked as expected.
@kundankumar011
@kundankumar011 1 month ago
@@sivashankar47 Glad to know that it is working for you as expected ❤️ Keep learning and enjoy. Please subscribe to my channel to encourage me and so you don't miss any notifications on upcoming relevant videos.
@only_voice_of_tamizhi
@only_voice_of_tamizhi 1 month ago
What should the Java version be for this Hive to run?
@kundankumar011
@kundankumar011 1 month ago
@@only_voice_of_tamizhi Thank you so much for watching this video. I have used JDK 8 and Apache Hadoop 3.2.4 for this Hive setup. Thank you so much for subscribing to my channel ❤️
@MOHITSINGH-ye3uh
@MOHITSINGH-ye3uh 1 month ago
Thank you so much, sir.
@kundankumar011
@kundankumar011 1 month ago
@@MOHITSINGH-ye3uh Pleasure, dear ❤️ I hope you have subscribed to my channel to encourage me and so you don't miss notifications on upcoming videos.
@MohammedSalman-nb2pu
@MohammedSalman-nb2pu 1 month ago
Yes, really, the video is very useful; after watching many different videos, I got the correct answer only here.
@kundankumar011
@kundankumar011 1 month ago
@@MohammedSalman-nb2pu I am glad to know that this video was very helpful for you and that you got your answers here ❤️. Please subscribe to my channel to encourage me, dear, and to get notifications for upcoming relevant videos.
@drm9514
@drm9514 1 month ago
Sir, I get this issue when running: wget -r -np -nH --cut-dirs=3 -R index.html svn.apache.org/repos/asf/hive/trunk/bin/ 'wget' is not recognized as an internal or external command, operable program or batch file.
@RiteshSingh-zb5ss
@RiteshSingh-zb5ss 2 months ago
Sir, the localhost cluster page is not opening
@kundankumar011
@kundankumar011 2 months ago
@@RiteshSingh-zb5ss Thank you so much for watching this video, dear, and happy to know that you managed to configure it successfully. Try to use localhost:50070/ and let me know whether you can open it. The link is also given in the video description, dear. Please subscribe to my channel to encourage me.
@RiteshSingh-zb5ss
@RiteshSingh-zb5ss 2 months ago
@@kundankumar011 Sir, it is not opening; the other two sites are opening, but the cluster site is not.
@kundankumar011
@kundankumar011 2 months ago
@@RiteshSingh-zb5ss That's good progress. What error is shown in the web browser when opening the cluster page? Please subscribe to encourage me, dear.
@RiteshSingh-zb5ss
@RiteshSingh-zb5ss 2 months ago
@@kundankumar011 It is showing "site can't be reached", and jps is showing only 1856 NameNode, 5248 DataNode, 10228 Jps
@RiteshSingh-zb5ss
@RiteshSingh-zb5ss 1 month ago
@@kundankumar011 localhost is not working for the cluster page, but it is working for the others (DataNode, etc.).
@sanoberfarooqui1048
@sanoberfarooqui1048 2 months ago
DataNode not working
@kundankumar011
@kundankumar011 2 months ago
Thank you so much for watching this video. If the Hadoop DataNode is not working or not starting, it could be due to several reasons, including misconfiguration, resource issues, or problems with the underlying storage. To troubleshoot, please check that the DataNode path is specified correctly in the "hdfs-site.xml" configuration file. In addition, check whether you have sufficient disk space. You can also try restarting the computer, starting all the daemons again, and seeing if it works. If possible, please subscribe to my channel to encourage me.
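For reference, the storage-directory properties in hdfs-site.xml for a single-node Windows setup typically look like this fragment (the C:/hadoop/data paths are an assumption matching other comments on this video; use your own directories):

```xml
<configuration>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///C:/hadoop/data/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///C:/hadoop/data/datanode</value>
  </property>
</configuration>
```

Both directories must exist and be writable by the user running the daemons.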
@arohgahane7364
@arohgahane7364 2 months ago
C:\Hive\bin>hive --service schematool -dbType derby -initSchema is showing "Missing hadoop installation: C:\Hadoop\bin must be set". How can I fix it?
@kundankumar011
@kundankumar011 2 months ago
Thank you so much for watching my video, dear. Sorry to hear that you are facing issues with the Hadoop installation. Ensure that you have configured the Hadoop installation properly before installing Hive, and set the HADOOP_HOME environment variable. If you haven't installed Hadoop, there is another video on my channel on installing Hadoop on Windows; you can watch that, dear. Let me know if it helped you. Thank you so much for subscribing to the channel.
@WalidAmehri
@WalidAmehri 2 months ago
Hi, thank you so much for the great effort you put into this tutorial; it is even better organized than the Hadoop one. The steps are straightforward and the explanations are clear. One point that was critical for me was the version compatibility between Hadoop and Hive: in my first installation I missed that information, then re-checked and found that the Hive version I had previously installed was not compatible with my Hadoop version. So I repeated the complete tutorial to ensure the Hive version is compatible with the Hadoop version. Can you please do a video on how to install Cassandra on Windows, just like you did here for Hive? Thanks again
@tonyobikwelu9783
@tonyobikwelu9783 2 months ago
God bless you, teacher, for your help
@kundankumar011
@kundankumar011 2 months ago
@@tonyobikwelu9783 The pleasure is mine dear ❤️ I hope you’ve subscribed to my channel to encourage me and receive notifications about upcoming videos.
@DEVO69
@DEVO69 2 months ago
Thanks, dada, for helping
@kundankumar011
@kundankumar011 2 months ago
I am glad to know, dada, that it was helpful ❤️, and thank you so much for subscribing to my channel
@SakshiGajjar-u3i
@SakshiGajjar-u3i 2 months ago
The NodeManager and ResourceManager are shutting down immediately; jps shows only 2820 Jps, 10328 NameNode, 21484 DataNode. Please help
@kundankumar011
@kundankumar011 2 months ago
Thank you so much for watching this video ❤️ Sorry to hear that you are facing issues. If the NodeManager and ResourceManager shut down immediately after starting, it typically indicates a configuration issue. Please validate the configuration files: double-check "yarn-site.xml" and "core-site.xml" for any misconfigurations, and ensure that yarn.resourcemanager.hostname and the other essential parameters are correct. If those are all OK, the issue could be insufficient disk space. Let me know if this helped you, dear. Please subscribe to my channel and encourage me
@GlobalTalesChronicles
@GlobalTalesChronicles 2 months ago
Hi sir, I followed every step you advised, but after running hdfs namenode -format I'm getting an error: "big" is not recognised and classpath is not recognised
@kundankumar011
@kundankumar011 2 months ago
@@GlobalTalesChronicles Thank you so much for watching this video, dear. Sorry to hear that you got errors while formatting the NameNode. What is the "big" word here? Can you copy and paste the whole error?
@HarshitaChandwani-x3i
@HarshitaChandwani-x3i 2 months ago
I am facing this error: C:\hive\apache-hive-4.0.0-bin>hive --service schematool -dbType derby -initSchema Exception in thread "main" java.lang.UnsupportedOperationException: 'posix:permissions' not supported as initial attribute at sun.nio.fs.WindowsSecurityDescriptor.fromAttribute(WindowsSecurityDescriptor.java:358) at sun.nio.fs.WindowsFileSystemProvider.createDirectory(WindowsFileSystemProvider.java:496) at java.nio.file.Files.createDirectory(Files.java:674) at java.nio.file.TempFileHelper.create(TempFileHelper.java:136) at java.nio.file.TempFileHelper.createTempDirectory(TempFileHelper.java:173) at java.nio.file.Files.createTempDirectory(Files.java:950) at org.apache.hadoop.util.RunJar.run(RunJar.java:296) at org.apache.hadoop.util.RunJar.main(RunJar.java:245) can anyone help ?