@@NicholasRenotte Hey, please do a detailed video on Beautiful Soup. Also, my request is: please make a video on other NLP use cases - text summarization, topic modeling, NER, text generation, Q&A with pretrained models (BERT, RoBERTa, GPT, and so on)
Brother, I want to say that your videos are very easy to understand and easy to follow along practically on our own computers. Please make a complete course on NLP, OpenCV, and machine learning, because you are a very good teacher. I really like you; may God give you a long life with a lot of happiness.
Amazing video! I mixed the content from your code with a CSV file that I got from extracting tweets, and now I can do sentiment analysis on so many subjects! Thank you!!
Awesome video! If it's within the extent of your knowledge, a subsequent video about fine-tuning a pre-trained model on a particular dataset would be very interesting. So: train, test, and validate the accuracy of a model pre- and post-fine-tuning. I have read some articles showing that a model's accuracy can improve significantly when it is trained on the particular dataset.
Definitely, plus the models become a lot more practical when tuned to your use case. Definitely will have a series of tutorials or one mega tutorial coming on it soon @Michael!
@@NicholasRenotte did you make a tutorial about this already? I couldn't like your video more; it's simple, informative, short, and straight to the point :) Thank you
Hey Nik! I've been looking left and right for a guide on how to interpret the logits from BERT, and thank god you've uploaded something that explains it well! I liked and subscribed. (Maybe you could refer me to a source where you happened to learn it? - looking for one for the FinBERT variant.) Also, your crash course on the scraping part is straight on point, and I love that you executed each line from it to show how the data gets transformed. You are truly blessed with an out-of-this-world tutoring skill!
Thanks sooo much @Capt_Kaplan! I'm kinda just doing a lot of googling to learn it atm, there's heaps of examples of custom models in the HF model repo though, check this out: huggingface.co/ProsusAI/finbert
Very good, pal. You keep it simple and at a high level. I would be happy to see those two downvoters elaborate on why they are not satisfied, though.
Hello Nicholas Renotte, I am a college student. I like your videos very much and I keep learning; you are like a teacher to me. Thank you very much, and I look forward to more videos.
Great vid! Not one of the more popular requests here, but I would be interested in a deep dive on building scrapers. It's kind of a niche topic since it's not the most exciting part of ML or data science, but at least I get to throw my hat in the ring. Haha. Looking forward to more vids =)
Love this video! You have an amazing skill for teaching others and making it look so simple. I'm subscribing to you, and I guess on my journey learning Python I will watch a lot of your videos. Thanks a lot!
Hmm, this is helpful for reviews but not for pos/neu/neg sentiment analysis, which is what I was actually trying to find examples of for score interpretation.
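The model used in the video predicts a 1-5 star rating rather than pos/neu/neg directly, but you can collapse the stars into a three-way label yourself. A minimal sketch - the 1-2 / 3 / 4-5 cut-offs are an assumption, so tune them to your own data:

```python
def stars_to_sentiment(stars: int) -> str:
    """Collapse a 1-5 star rating into a pos/neu/neg label.

    The cut-offs (1-2 negative, 3 neutral, 4-5 positive) are an
    assumption - adjust them to fit your use case.
    """
    if stars <= 2:
        return "negative"
    if stars == 3:
        return "neutral"
    return "positive"

print(stars_to_sentiment(1))  # negative
print(stars_to_sentiment(5))  # positive
```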
Fantastic explanations in all your videos. Just wondering whether you can do a video on sktime - installation, introduction, processes, etc. Hope to hear from you. Cheerio!
Awesome video! But I was wondering if text preprocessing (removing punctuation, stop words, lemmatization, etc.) is needed before encoding the reviews? I'd appreciate an answer.
Hi Nicholas, thank you for the video! It's really helpful for my work. Also, I have a question: is it a must to do text pre-processing when applying a deep learning NLP model? It seems like you didn't apply any text preprocessing to the raw text you scraped.
These models are able to handle raw, unprocessed input as they have a pre-processing pipeline built in @lowwwleechi. If I got sucky results I might add some additional pre-processing to improve them!
Hey Nicholas, is there any way of changing the BERT sentiment score from 1-5 to a floating-point value between 0 and 1, similar to, for example, the VADER sentiment score? Or do I need another BERT sentiment model for this? Thanks in advance :)
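You don't need another model for this - one simple option is to min-max rescale the 1-5 star output to the unit interval yourself. (Note that VADER's compound score actually spans -1 to 1; this matches its 0-1 pos/neu/neg components rather than the compound value.) A minimal sketch:

```python
def star_to_unit_interval(stars: int) -> float:
    """Min-max rescale a 1-5 star rating to [0, 1]:
    1 -> 0.0, 2 -> 0.25, 3 -> 0.5, 4 -> 0.75, 5 -> 1.0."""
    return (stars - 1) / 4

for s in range(1, 6):
    print(s, star_to_unit_interval(s))
```

If you want a genuinely continuous score instead of five discrete steps, another option is to softmax the model's five logits into probabilities, take the expected star rating, and rescale that the same way.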
Love Mito, but I had them on a Zoom call - it does not work with a Streamlit app. Perhaps it does not work with any app outside of a Jupyter notebook, not sure 🤷♂️
Hi Nicholas, thank you for this video. Good information 👍 I have a doubt: can it handle sarcastic reviews? Also, don't we remove stop words from text data?
Another great tutorial, thanks man! I have one question: in this video we are only able to extract reviews from page 1, so how do we extract reviews from all the pages? Thanks once again.
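For Yelp-style sites, later review pages are usually reached with a `?start=N` offset in the URL - that's an assumption about the site, so inspect its "next page" links to confirm. A sketch that builds the paginated URLs and reuses the same regex class match from the video (the base URL, class name, and demo HTML below are illustrative):

```python
import re
from bs4 import BeautifulSoup

def page_urls(base_url, pages, per_page=10):
    """Build one URL per results page using a ?start=N offset."""
    return [f"{base_url}?start={i * per_page}" for i in range(pages)]

def extract_reviews(html):
    """Same idea as the video: grab <p> tags whose class matches 'comment'."""
    soup = BeautifulSoup(html, "html.parser")
    return [p.text for p in soup.find_all("p", class_=re.compile(".*comment.*"))]

# In practice you would fetch each page and accumulate the results:
#   all_reviews = []
#   for url in page_urls("https://www.yelp.com/biz/some-restaurant", pages=5):
#       all_reviews += extract_reviews(requests.get(url).text)
print(page_urls("https://example.com/biz/cafe", 3))
demo_html = '<p class="comment__abc">Great food!</p><p class="sidebar">skip</p>'
print(extract_reviews(demo_html))  # ['Great food!']
```

The same pattern works for other sites too - swap the base URL and the class regex for whatever the target site's review markup uses.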
Thank you so much, your video is so helpful. Is it possible to make another one about how to do sentiment analysis with BERTweet for COVID-19 tweets? Thanks again.
Hello Nicholas, is it recommended to code all the machine learning algorithms from scratch so that I can learn the math behind them, or just understand them and start coding?
I don't personally think so; I think it's more important to know how to use them as a data scientist. If you're looking to get into research, however, you would probably need a deeper understanding!
Instead of using a lambda function for adding the review scores to the dataframe, would it be equally good to use a list comprehension to get a list of the scores, then create a new column out of that in the dataframe?
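Yes - both approaches produce the same column. A self-contained comparison using a stand-in `sentiment_score` (the real one from the video runs the tokenizer and model, which is swapped out here so the snippet runs on its own):

```python
import pandas as pd

def sentiment_score(review):
    """Stand-in for the real scorer from the video (tokenizer + model +
    argmax of the logits); returns a fake 1-5 rating so this runs alone."""
    return len(review) % 5 + 1

df = pd.DataFrame({"review": ["Great!", "Awful service", "It was fine"]})

# Option 1: .apply with a lambda, as in the video
df["score_apply"] = df["review"].apply(lambda x: sentiment_score(x))

# Option 2: list comprehension, then assign the list as a new column
df["score_listcomp"] = [sentiment_score(r) for r in df["review"]]

print(df["score_apply"].equals(df["score_listcomp"]))  # True
```

`.apply` is the more idiomatic pandas spelling, but the list comprehension is equally valid and sometimes reads more clearly; for a few hundred reviews the performance difference is negligible either way.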
Can you make one on mT5 or M2M100 for language translation models and how they can be fine-tuned from Hugging Face? It would be super helpful for techies in countries where content is limited.
Multilingual, and Dutch too... a double Heineken for u 🍻 Maybe another vid on text classification for different classes/categories based on BERT (bert-base-multilingual-cased)? Not sentiment specifically, but categorizing sentences/text into subjects like cars, beer, soccer...
You got it, actually started work on a mega NLP tutorial. Zero-Shot Classification would be perfect for that!! Can't wait to drink that Heineken this weekend, still hoping Hamilton beats Verstappen though 😂.
@@NicholasRenotte Looking forward 👍 Until this new season, the F1 races were dominated by Mercedes. It is like running TensorFlow on GPU versus TPU 😁 If Verstappen or other talented drivers had the same car as Hamilton, the competition would be more interesting... But of course I hope Verstappen will win!
@@NicholasRenotte Yes, the anomaly detection algorithm of Red Bull could be improved, and I think Hamilton has not trained with the right AI Gym RL algorithm 😎
I have a problem: RuntimeError: Failed to import transformers.models.bert.modeling_bert because of the following error (look up to see its traceback): No module named 'torch.utils'
Is there an up-to-date multilingual BERT? The program you are using relies on old versions of numpy, among other things. If there is an up-to-date version, it would be amazing to know where to find it, thanks!
Hey Nicholas, great video!!! I was wondering: if I change the URL in the code and give the class name of another website's comments (other than Yelp), will it still work?
I am stuck, as I want to add an input to be predicted. Basically, I want to add an input (a user would write a line for which the sentiment should be predicted), but I am unable to write the code for it. Can anyone please help me?
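One way to wire this up is to wrap the scoring call in an input loop. The sketch below uses a placeholder `sentiment_score` so it runs on its own - swap its body for the real tokenizer/model call from the video - and takes the read/write functions as parameters so the loop can be demoed without a live terminal:

```python
def sentiment_score(text):
    """Placeholder scorer - replace the body with the real call from the video:
        tokens = tokenizer.encode(text, return_tensors='pt')
        return int(torch.argmax(model(tokens).logits)) + 1
    """
    return 3

def run_prediction_loop(read_line=input, write=print):
    """Keep prompting for review text until the user types 'quit'."""
    while True:
        text = read_line("Enter a review (or 'quit' to stop): ")
        if text.strip().lower() == "quit":
            break
        write(f"Predicted sentiment: {sentiment_score(text)} star(s)")

# Demo with canned input instead of a live prompt:
canned = iter(["Loved it!", "quit"])
outputs = []
run_prediction_loop(read_line=lambda prompt: next(canned), write=outputs.append)
print(outputs)  # ['Predicted sentiment: 3 star(s)']
```

Calling `run_prediction_loop()` with no arguments gives the interactive version, where the user types one line at a time at the prompt.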
How would I fine-tune this? Right now I'm getting some inaccurate scores: for some reviews it gives me 3 when it should be 4 stars, or it gives 5.
Hi, I'm just learning NLP and it's totally new to me. I just want to know how tokens/text can be passed to a particular NLP pipeline. Thanks in advance.
Do you currently have a video that covers how to do this? ->> "Can't be bothered building a model from scratch?" I would like to learn how to build or fine-tune BERT. Thanks.
Hello Nicholas, many thanks for the video. How can one implement this code in TensorFlow, and what advantage does PyTorch have over TensorFlow for BERT implementations?
Nice video... easy to understand many new terms... but I can't understand line 4: why do we use this particular URL, and how is using it with the data useful? Please reply.