Hackpens
Comments
@firozshaikh3739 8 days ago
hi, do you know how to copy the log file from cowrie while the honeypot is on?
@derekberthiaume5367 11 days ago
If I wanted to count the number of times each unique instance showed up, what would I do? Would I do the uniq and then do the word count for each instance by using grep for that specific phrase?
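A quick sketch of one way to answer this question: instead of grepping for each phrase separately, uniq's -c flag counts every distinct line in one pass (the sample data below stands in for the video's auth.log).

```shell
# Build a small sample file of repeated values.
printf 'alice\nbob\nalice\nalice\nbob\ncarol\n' > sample.log

# sort groups identical lines together, uniq -c prefixes each with its count,
# and the final sort -rn puts the most frequent entries first.
sort sample.log | uniq -c | sort -rn
```

This avoids one grep-plus-wc pass per phrase, which matters once there are hundreds of distinct values.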
@quarylaniel 28 days ago
REALLY HELPED THANK YOU SO MUCH
@Monana666 1 month ago
this is exactly what I was looking for and even more! thank you so much!
@mahendra.l861 1 month ago
How can we do it without changing directory?
@mahendra.l861 1 month ago
I don't want each line's content, just a display of which log files are present in all the other subdirectories as well.
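A sketch of what this question is after: find lists matching file paths recursively without cd-ing anywhere and without printing file contents (the demo directory below is made up for illustration).

```shell
# Create a small demo tree with log files in different subdirectories.
mkdir -p demo/a demo/b
touch demo/a/app.log demo/b/sys.log demo/readme.txt

# Print only the paths of .log files, at any depth, without changing directory.
find demo -name '*.log'
```

The same idea works from anywhere by giving find an absolute starting path, e.g. `find /var/log -name '*.log'`.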
@guths 2 months ago
awesome video
@potatochannel1948 2 months ago
This is one of the most helpful tutorials out there showing how powerful grep and pipes are. Thanks for sharing, and I hope you make more cool stuff.
@learningbd5306 3 months ago
Thanks
@dodokwak 3 months ago
You could configure fail2ban not only for sshd but also for nginx requests to catch 400-404 errors.
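A minimal sketch of what such a jail might look like. The section and filter names here are hypothetical, and the matching failregex for 400-404 responses would live in a separate filter file under /etc/fail2ban/filter.d/:

```ini
; /etc/fail2ban/jail.local -- hypothetical sketch, not a tested config
[nginx-4xx]
enabled  = true
port     = http,https
filter   = nginx-4xx
logpath  = /var/log/nginx/access.log
maxretry = 10
bantime  = 3600
```

As with the sshd jail shown in the video, fail2ban would then ban IPs that trip the filter too many times within the findtime window.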
@dodokwak 3 months ago
Thx. Very helpful.
@AbdoTawdy 4 months ago
For compressed files: zcat and zgrep.
@TrendyTales-ep9yq 4 months ago
Sir, can we use awk instead of cut?
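Yes, and a quick comparison shows why awk is often the safer choice here: awk splits on any run of whitespace by default, while cut -d ' ' treats every single space as a delimiter (the sample line below mimics syslog's double-space date padding).

```shell
line='Jan  1 10:00:01 host sshd[123]: Failed password'

# cut counts the double space as two delimiters, so field 2 is empty.
echo "$line" | cut -d ' ' -f 2

# awk collapses the run of spaces, so field 2 is the day of the month.
echo "$line" | awk '{print $2}'
```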
@makopafruit 5 months ago
Thank you!
@gingerfication3375 5 months ago
Thank you
@heli0s359 6 months ago
genius
@bonatate1457 7 months ago
Beautiful. Absolutely beautiful.
@PEDERSTEENBERG-d5h 7 months ago
How can I see all files on a hard drive or USB? And how could decrypted files be erased or overwritten with sudo shred?
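A rough sketch answering both halves of this question, assuming a GNU/Linux system with coreutils. The directory name stands in for wherever the drive is mounted (e.g. /media/usb); shred's overwrite guarantees are weaker on SSDs and journaling filesystems.

```shell
# Stand-in for a mounted drive; on a real system this would be the mount point.
mkdir -p drive-demo
echo secret > drive-demo/notes.txt

# List every file under the mount point, recursively.
find drive-demo -type f

# Overwrite the file's blocks 3 times, then unlink it (-u).
shred -u -n 3 drive-demo/notes.txt
ls drive-demo
```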
@lawman2112 7 months ago
./pingy: line 15: $OCTETS.txt: ambiguous redirect. Any ideas? I followed step by step; the syntax is exactly identical.
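I haven't seen the pingy script, but "ambiguous redirect" in bash usually means the variable in the redirect target is unset, or expands to something containing whitespace, at the moment that line runs. Quoting the expansion (and using braces for clarity) is the usual fix; the value below is hypothetical.

```shell
# Hypothetical value; the real script presumably derives this from an IP.
OCTETS="10.0.0"

# Quoted: the target is always exactly one word, so the redirect can't be ambiguous.
echo probe > "${OCTETS}.txt"
ls "${OCTETS}.txt"
```

If the error persists, echo the variable just before the redirect: an empty result means it was never assigned (a typo in the variable name is a common cause).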
@kshitijayamgar1954 7 months ago
thank you so much 😃🙏
@ВиталийОвчаренко-т7й 8 months ago
To filter .log files using cat, grep, cut, sort, and uniq, follow these steps:

1. First, open your terminal.

2. Navigate to the directory containing the .log files you want to filter, using the 'cd' command followed by the directory path. For example:
```bash
cd /path/to/your/log/files
```

3. Use the 'cat' command to concatenate and display the contents of a .log file. For instance:
```bash
cat your_log_file.log
```

4. To search for specific lines in the .log file, use the 'grep' command. For example, to find all lines containing the word 'error':
```bash
grep 'error' your_log_file.log
```

5. To extract specific columns from the output, use the 'cut' command. The format is 'cut -d delimiter -f fields'. For example, if your log file has columns separated by a space and you want to extract the first column:
```bash
cut -d ' ' -f1
```

6. To sort the lines alphabetically or numerically, use the 'sort' command. For example:
```bash
sort your_log_file.log
```

7. Finally, to remove duplicate lines from the sorted output, use the 'uniq' command. For example:
```bash
uniq your_log_file.log
```

By combining these commands, you can create a pipeline to filter .log files effectively. For instance:
```bash
cat your_log_file.log | grep 'error' | cut -d ' ' -f1 | sort | uniq
```

This command will display unique first columns from lines containing the word 'error' in your_log_file.log.
@cainanashton 8 months ago
Good vid, thank you
@messileo919 9 months ago
Thank you, this video was exactly what I needed.
@MAX-nv6yj 10 months ago
thanks for the amazing video love it <3
@loopydooeu9397 10 months ago
as a complete beginner, this video really helps. Thanks a lot!!!
@mrmotofy 11 months ago
A great simple illustration
@gustavotobias7681 11 months ago
Very good video, thanks for sharing, greetings from Mexico.
@FarizK-g7z 11 months ago
@Hackpens hope you are doing well. Amazing videos, full of information I cannot find in hours of training videos. Please create more, the community here is waiting!!!
@beyremrjeybi9978 11 months ago
Nice, except cut -d " " -f x is not working for me; I will dig further to figure out why...
@genghismike6186 1 year ago
Thanks, very helpful; I will be using this as a reference from now on.
@bramkesseler1582 1 year ago
6:36 someone tried Minecraft lol
@erbenton07 1 year ago
You don't need cat, just use grep "string" auth.log. Also, instead of cut, you can just use awk '{print $11}'.
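To illustrate this commenter's two tips with a made-up auth.log-style line (field positions here are specific to this sample, not to every log format):

```shell
printf 'sshd: Failed password for invalid user admin from 1.2.3.4\n' > auth-sample.log

# grep reads the file itself, so cat is unnecessary.
grep 'invalid' auth-sample.log

# awk can filter and pick a field in one step; field 9 is the IP in this sample.
awk '/invalid/ {print $9}' auth-sample.log
```

Dropping cat saves a process, and the awk pattern replaces the grep | cut pair entirely.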
@EdHatesNoobTubers 1 year ago
😊 great videos 👍 thank you!!!
@ibrahimbadsha6741 1 year ago
Your explanation is better than most others, thank you so much sir.
@averagetechnologyenojyer 1 year ago
Keep making these super cool videos, they're such a lifesaver
@paaao 1 year ago
Great video showing the power of the built-in command line tools. Remember, the command line (and your chosen shell, i.e. bash) interacts directly with the kernel. Control your hardware directly from your keyboard rather than depending on GUI interpreters that stand between you and the kernel like other operating systems.
@paaao 1 year ago
Now dump all the unique IPs into a text file, and run nslookup on each one. $50 says they are all located in China or Russia. At least 98-99% of them. At least that's what I always end up finding.
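A sketch of the loop this commenter describes, using documentation-reserved example IPs. Real lookups need network access, and many attacker IPs have no reverse-DNS record at all; for country information, whois or geoiplookup tends to be more direct than nslookup.

```shell
# Stand-in for the uniq_ips.txt produced by the video's pipeline.
cat > uniq_ips.txt <<'EOF'
203.0.113.5
198.51.100.7
EOF

# Reverse-look-up each IP; '|| true' keeps the loop going when a lookup fails.
while read -r ip; do
    echo "--- $ip ---"
    nslookup "$ip" 2>/dev/null || true
done < uniq_ips.txt
```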
@0xpurn 1 year ago
Quick revision:
cat auth.log | grep "invalid" | cut -d " " -f 11 | sort | uniq | wc -l
cat fail2ban.log | grep "Ban" | grep -v "Restore" | cut -d " " -f 16 | sort | uniq -d > ~/uniq_ips.txt
@4EntertainmentOnly 1 year ago
Awesome.
@yosefberger6259 1 year ago
Great introduction to the topic. A few things I think are worth mentioning once people have learned the commands being demonstrated:

If the logs you're using have a variable number of spaces between columns (to make things look nice), that can mess up cut. To get around that you can use `sed 's/  */ /g'` to replace any run of spaces with a single space. You can also use awk to replace the sed/cut combo, but that's a whole different topic.

uniq also has the extremely useful -c flag, which adds a count of how many instances of each item there were.

As an aside, if you want to cut down on the number of commands, you can do things like `grep expression filepath` or `sort -u` (on a new enough system), but in the context of this video it is probably better that people learn about the existence of the stand-alone utilities, which can be more versatile.

Once you're confident with the tools mentioned in the video but still find you need more granularity than the grep/grep -v combo, you can use special characters in the pattern that represent concepts like "the start of a line" (^) or "anything" (.*). For example, `grep "^Hello.*World"` matches any line that starts with Hello and at some point also contains World, with anything or nothing in between or after. If that still isn't enough, you might want to look into full regular expressions with grep, though they can be hard to wrap your mind around if you've never used them before. (If you don't really understand these patterns just from reading this, that's fine; I'm just trying to give you the right terms to Google, because once you know something's name it becomes infinitely easier to find resources on it.)
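Combining two of the tips above in one runnable sketch: squeeze runs of spaces so cut's fields line up, then count occurrences with uniq -c (the sample lines below imitate syslog's padded dates and stand in for a real auth.log).

```shell
printf 'Jan  1 fail 1.2.3.4\nJan  2 fail 1.2.3.4\nJan  3 fail 5.6.7.8\n' > demo-auth.log

# sed collapses each run of spaces to one, so field 4 is always the IP;
# uniq -c then counts how many times each IP appears.
sed 's/  */ /g' demo-auth.log | cut -d ' ' -f 4 | sort | uniq -c | sort -rn
```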
@msnraju97 1 year ago
I am checking this video 3 years after upload. The tutorial is on point and clear.
@fredflintstone505 1 year ago
Thanks! That was informative. The only thing I would have done differently is flip the order of uniq -d and sort: fewer items to sort after uniq filters them out.
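One caveat worth testing before trying this reordering: uniq only collapses *adjacent* duplicate lines, so running it before sort misses repeats that are not next to each other. A two-line demonstration:

```shell
# The two a's are separated by b, so uniq before sort leaves all 3 lines.
printf 'a\nb\na\n' | uniq | wc -l

# sort groups the a's first, so uniq can collapse them down to 2 lines.
printf 'a\nb\na\n' | sort | uniq | wc -l
```

The same adjacency rule applies to uniq -d, which only reports duplicates it actually sees side by side.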
@sdk5611 1 year ago
Great stuff with a simple explanation.
@MohamedAli-dx2fp 1 year ago
Can't be more appreciative. Thanks man 👏
@a.m.karthick629 1 year ago
Sir, wonderful explanation. Kindly do more real-time videos on Linux Bash scripting; it will be very helpful for me and people like me who are trying to get into Bash scripting under pressure from higher management to finish an automation task :):)
@ansellroman6620 1 year ago
From the IP address, can you find out their location?
@hackpens2246 1 year ago
If the user isn't using a VPN service, then yes (an approximate location) using a publicly available tool, like whatismyipaddress.com/ip-lookup
@Esmeralda-bq7ev 1 year ago
Thank you
@hackpens2246 1 year ago
You're welcome :)
@geetabasker7127 1 year ago
The concept is explained well in a short video.
@richardazu7445 1 year ago
Simple and straightforward ❤
@brahimayoada2657 1 year ago
Well done
@yash1152 1 year ago
From the description: > _"I show you how to filter information from a .log file, and you find out just how important strong passwords really are."_ I always wondered whether pattern matching has something to do with password security, but then I thought: you have to have the passwords to apply pattern matching on them, right? Because the password input field of a site doesn't accept regex, and generating exhaustive strings from a regex doesn't help either... So what scenario are we imagining when talking about regex in the context of secure passwords?