Besides the range... I see no major advantage. It sounds like a very expensive and overly complicated solution to a problem that can be solved just as well with a shotgun. The disadvantage here is that the anti-drone device needs to be pointed at the drone for some time, and that leaves the soldier carrying it exposed. The device is also large, quite heavy, and cumbersome. So... if you have the luxury of spotting the drone from 5 km away, be my guest, this might work. But small suicide or grenade-dropping drones are ambush weapons: they can be launched quickly and engage from concealed positions (bushes, hills, treelines, building corners) at far shorter distances. It's not like any semi-decent drone operator will fly the thing straight on from the most obvious angle. Soldiers scanning their environment now have to consider the entire hemisphere of line of sight, which also limits their awareness of other threats.
Thank you, I always wanted to know how this works. I didn't know you could convert letters to binary numbers. That sparked a whole new inspiration for me to create a code in the story I'm writing. Thank you ❤
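For anyone curious about the letter-to-binary conversion mentioned above, here is a minimal Python sketch. It uses each character's ASCII code point (the sample word "HI" is just an illustration; any text works):

```python
# Convert each letter of a word to its 8-bit binary ASCII code.
word = "HI"

# ord() gives the character's numeric code point (H = 72, I = 73);
# format(..., "08b") renders it as an 8-digit binary string.
bits = [format(ord(ch), "08b") for ch in word]

print(bits)  # ['01001000', '01001001']
```

Reversing the process is just `chr(int(b, 2))` for each binary string, which is handy if the story's code needs to be decoded again.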
Lol, using 3.5" disks to explain computers without a hard drive. Back in the day when computers came with floppy drives and no hard disk, they used the (more fragile) 5.25" floppy disks. The name "floppy disk" comes from the fact that it could flop (bend) easily. 3.5" disks arrived when hard disks were already the mainstream standard.
The 32/64-bit architecture is about more than just memory addressing. The registers of a 64-bit CPU are 64 bits wide, which means the CPU can perform 64-bit arithmetic and logical operations natively, where a 32-bit CPU is limited to 32-bit native operations. This makes the 64-bit CPU faster at that kind of wide arithmetic.
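The register-width point above can be illustrated in Python by masking results to a fixed width, simulating what a hardware register can hold (the specific numbers are just an example):

```python
# Python ints are arbitrary precision, so we mask results to
# simulate fixed-width registers.
MASK32 = (1 << 32) - 1   # what a 32-bit register can hold
MASK64 = (1 << 64) - 1   # what a 64-bit register can hold

a = 3_000_000_000
b = 3_000_000_000

# In a 32-bit register the sum exceeds 2**32 and wraps around;
# a 32-bit CPU would need multiple instructions to get the true result.
print((a + b) & MASK32)  # 1705032704 (wrapped)

# A 64-bit register holds the full result in a single operation.
print((a + b) & MASK64)  # 6000000000
```

The wrapped value is exactly `6_000_000_000 - 2**32`, which is why 32-bit code handling large numbers has to chain carry operations across multiple registers.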
Using C: for the boot HD isn't a big deal. Also, many computers from the pre-HD era had just one floppy drive. You'd put in your boot disk and take it out when booting was done. MS-DOS was small enough to reside completely in memory once loaded. Even two floppy drives was a luxury.
I remember the ribbon cables were twisted so that the drive at the end of the cable would be A: and the drive in the middle would be B:. Honestly it doesn't matter what letters are used, but it is nice to have C: as the OS disk on every computer.
Floppy drives used different connectors than hard drives, and these connectors are sometimes not even present on newer hardware. Early machines could support up to four floppies, so many times your hard drive was labeled D:, E:, or F:. When hard drives became standard in computers, the floppies were still used to load programs onto the hard drive, so A: and B: were reserved for floppies and C: was the start of the hard drives. With the introduction of CD-ROMs the floppies lost their usefulness and disappeared from modern machines. If your modern machine does still have floppy support, you will only be able to connect one through a DRVSB connector that routes it through the serial port bus.
I started using and programming computers when the drives used 5.25" actual "floppy" disks. I think their capacity was somewhere around 760k! I remember renting an external 20MB hard drive that took up half a square foot of desk and weighed 10 pounds for my software business, because buying one was more than the budget would allow. Monitors were huge boxes with 14" monochrome displays. The World Wide Web was only accessible through an acoustic modem and a dial phone, and download speeds were a blazing 720KBPS. Now my laptops have 1 terabyte SSDs plus 500GB hard drives in them. That's 75,000 times the capacity of the external HD. They're the size of a stack of ten playing cards and weigh 1 ounce. The machines are connected to the internet at >100MBPS 24/7. That's how ancient I am.
@@techindex1 The amazing thing to me is that electronics get smaller, faster, better quality, and less expensive as technology permits. Everything else seems to be headed in the exact opposite direction. Sure, cars are safer, start and run better with less maintenance, and have better fuel economy, but they cost 10-20 times as much as cars of 50 years ago, and major repairs are astronomically expensive. Generally, household goods are made from cheaper materials, and consumables like hygiene products, food, paper products, clothing, gas, and electric energy are all many times more expensive than even a generation ago. While average household income has increased dramatically, inflation has far outpaced our disposable income.
Since it's what the standard Windows install uses, I think we should just leave the C drive as is. Even if we ditched drive letters and named each drive (like Linux uses), that would probably still cause confusion with most computer users.
No discussion of the 20th-pin issues on DisplayPort. DisplayPort is superior, but HDMI is more foolproof. No idea what I'm talking about with these 20th-pin issues? Let's just say not all DisplayPort cables are created equal.
I believe this is a Trojan that, once installed, uses your computer's resources for crypto mining. If you're using Windows, run a Windows Defender scan and delete it ASAP.
I'm thinking of buying the Asus ROG Swift PG27AQDM OLED monitor. It specifies having "HDMI 2.0". But I recently came across a comment recommending using an HDMI 2.1 cable for it. I was wondering if there would be a difference and, if so, whether it's worth getting a 2.1 cable? Amazing video btw, straight to the point :))
I use an HP docking station, and it's a nightmare for me: when I'm working, I suddenly lose my monitor signal on my PC, and my colleagues do too. I prefer to use HDMI, because I don't have this problem even though I use a cheap HDMI cable.
@@techindex1 No, there was no technical issue, just DisplayPort being DisplayPort and not recognizing the monitors. Eventually it worked, but it took hours of plugging and unplugging.
As of this writing, Dell is offering DisplayPort 1.4. As with TVs, the higher you go on resolution, the less you will be able to utilize it. Standard broadcast signals DO NOT support higher resolutions, so spending the money on fancy graphics is a waste because you'll never see those higher resolutions. Unless you're into gaming and money is no object, your eyes will hardly see the difference. Some may argue they can, but the ability to receive super-high resolution is limited by device and signal. Be happy with a lesser standard until a higher standard is adopted mainstream.
Thanks for explaining the difference. Not a lot to choose from at the thrift store tech shelves. Across the pond we are amazed how you folks pronounce the letter “H”. Not sure where you got that…
I read an interview with George Lucas where, asked when Star Wars would be hitting high definition, he said he was waiting for the format wars to end, but did state that he was siding with Blu-ray to come out the victor. So yeah, based on that I was going with Blu-ray too.
Microsoft was too US-centric. Yes, they were too late to the market in the US. However, they did achieve some momentum with Windows Phone in other markets like Germany, where it reached somewhere between 10-15% share. Their US focus prevented them from recognizing this and building on the markets where they were performing better.
I'm from Germany and loved my two Windows Phones, especially for the Xbox integration and after the 8.1 update, which came far too late. If only there had been a few more apps available, I might be using them to this day.
My best phone ever was a Lumia 950. The active tiles were the best feature. Perfect while traveling, especially for displaying boarding info without opening the app. How long did we have to wait for something like that in iOS? 🤔
Also, IIRC HDMI supports an Ethernet connection, so (if the feature is implemented in every device) it can cut down a bit on cable clutter in your AV setup without sacrificing the added stability of a wired network connection.
Thank you so much for this video, it's very enlightening. Well, I have a problem with my kind of cheap 4K@60Hz TV: it feels like it's locked at 30Hz even though it says 60Hz. When gaming, staying still makes the picture look really clear and nice, but once you start moving around it gets really blurry. My TV doesn't have any option for turning off motion blur, motion smoothing, or the soap opera effect (don't know exactly what they call it). It only has an option for noise reduction, and when I turn that off it kind of helps, but it's still blurry when moving, especially when playing soccer. I've tried every option on this TV, but no luck. My question: is there any external device I can use, like an EDID emulator or a DisplayPort adapter, to make the picture look clear? Thanks in advance.
You're welcome! It could be that the game you're playing doesn't support 60fps on console, if it's an older game. I'd also check the settings of your console to see if you have an option for 60fps/60Hz. In some games you can choose in the settings menu whether you want quality mode (usually 4K 30fps) or performance mode (1080p/1440p 60fps), so it'd be worth checking your settings there. Some TVs have picture modes; choose gaming if you have the option. If it's been set to movie mode, this will often limit the refresh rate to give it a more film-like look (a lot of films are shot at 24fps). If all of this fails, I'd probably speak to the manufacturer next and see if they can advise a setting change that may help, as there may be one hidden within the menus that isn't obvious.
Had a Lumia 640 back in 2015. Started out on Windows Phone 8 and upgraded to Windows 10 Mobile. Best phone OS ever, light years ahead of Apple's and Android's boring square icon grid aesthetic.