
Summarize PDF Docs & Extract Information with AI & R | Step-By-Step Tutorial 

Albert Rapp
1.8K views
DESCRIPTION AND CODE
With {tidychatmodels} it's pretty easy to set up a chat flow to use AI for summarizing PDFs and extracting key information from them. In this video, I show you how that works. You can find the corresponding blog post at albert-rapp.de...
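The workflow described above can be sketched roughly like this. This is a minimal sketch, not the exact code from the video: the chat-flow verbs (`create_chat()`, `add_model()`, `add_message()`, `perform_chat()`, `extract_chat()`) follow the {tidychatmodels} README, while the model name, file name, and environment variable are assumptions.

```r
library(tidychatmodels)  # remotes::install_github("AlbertRapp/tidychatmodels")
library(pdftools)

# 1. Extract the raw text from the PDF (pdf_text() returns one string per page)
pdf_pages <- pdf_text("my_document.pdf")
pdf_full_text <- paste(pdf_pages, collapse = "\n")

# 2. Set up the chat flow and ask the model for a summary
chat <- create_chat("openai", Sys.getenv("OPENAI_API_KEY")) |>
  add_model("gpt-4o-mini") |>
  add_message(
    paste("Summarize the key points of this document:", pdf_full_text)
  ) |>
  perform_chat()

# 3. Pull the assistant's reply out of the chat object
chat |> extract_chat()
```

The same pipeline can be pointed at other vendors (e.g. a local model via ollama) by changing the arguments to `create_chat()` and `add_model()`.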
📈 CREATE EFFECTIVE CHARTS
Check out my video course to create insightful data visualizations with ggplot at arapp.thinkifi...
MORE VIDEOS
📺 Avoid duplicate R code in 150 seconds • Avoid duplicate code w...
📺 Shiny modules in 100 seconds • Shiny Modules in 100 S...
📺 Fast explainer playlist • Explainer videos
Subscribe at 👉 / @rappa753
MORE CONTENT
- weekly 3-minute newsletter about R, DataViz and webdev at 3mw.albert-rap...
- LinkedIn at / dr-albert-rapp-9a5b9b28b
#rstats #dataviz #ggplot #dplyr

Published: Sep 17, 2024

Comments: 17
@rappa753 · 3 months ago
If you enjoyed this video and want to level up your R skills even further, check out my latest video courses:
📍 Data Cleaning Master Class at data-cleaning.albert-rapp.de/
📍 Insightful Data Visualizations for "Uncreative" R Users at arapp.thinkific.com/courses/insightful-data-visualizations-for-uncreative-r-users
@jysushdz7826 · 6 months ago
Thanks Albert for sharing this amazing tutorial
@rappa753 · 6 months ago
You're very welcome :)
@AlexLabuda · 6 months ago
Very nice tutorial, thanks for sharing. Excited to check it out!
@jimmahgee · 5 months ago
This is super, Albert. At work we have a load of complex documents that get processed by humans each year. For a hackathon we're going to see if we can pre-process them using LLMs. So this is a super useful starting point for my attempt when the time comes!
@rappa753 · 5 months ago
Nice, best of luck with the hackathon! Let me know how it went 🤗
@jiazhang803 · 6 months ago
Excellent! Thanks Albert! May I ask how you pasted different pieces of code so quickly during the tutorial? Looks like magic to me🙈
@rappa753 · 6 months ago
Thank you 😊 I throw in the whole code chunk and remove it in reverse order. Then I just press Ctrl-Z 😀
@jiazhang803 · 5 months ago
Haha, good trick! Thanks @rappa753
@olexiypukhov-KT · 2 months ago
I find it easier to just use {httr}, call the API, and do the mapping myself. But to each their own.
@rappa753 · 2 months ago
I can relate. People who already know how to work with APIs might feel like it's more overhead. How do you feel about the {httr2} package? I feel like it has a much nicer syntax than {httr}.
@bubaptak2294 · 10 days ago
Thank you for this wonderful video! 😊 But what about data protection and data security? Sensitive data from the PDF is forwarded directly to ChatGPT, correct? (Sorry for the question, I'm just a beginner with R 😬)
@rappa753 · 9 days ago
Glad that you like it. You could use a local LLM with the ollama vendor. In the future I also want to add support for Azure. That way, you can host private instances of ChatGPT, which is how a lot of companies use it.
@bubaptak2294 · 8 days ago
@rappa753 Great, thank you 👌🏻
@hanshellsmark8430 · 3 months ago
Awesome package! Are rate limits handled in the calls to the different models? If not, the map function will be difficult to use for larger use cases, right?
@rappa753 · 3 months ago
Glad that you like it. Rate limits are not yet handled, so you might have to go with a classic for loop and add a bit of a pause between iterations.
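The classic for loop with a pause mentioned above could look like this. This is a sketch: `summarize_pdf()` is a hypothetical wrapper around the chat call from the video, and the 2-second pause is an arbitrary choice that you'd tune to your vendor's rate limit.

```r
# Hypothetical list of PDF paths to process
pdf_files <- list.files("pdfs", pattern = "\\.pdf$", full.names = TRUE)

# Pre-allocate a list for the results
results <- vector("list", length(pdf_files))

for (i in seq_along(pdf_files)) {
  # summarize_pdf() stands in for the tidychatmodels chat flow
  results[[i]] <- summarize_pdf(pdf_files[[i]])
  Sys.sleep(2)  # pause between iterations to stay under the rate limit
}
```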
@listonjohnson3408 · 2 months ago
Has anyone set this up using local models and Ollama, negating the need for API keys?