Yeah, WiFi is a standard made to make wireless networking simpler to implement and use. That's what you should tell your family. Sure, it can lead to becoming an engineer, but it's meant to let everyone use wireless.
If you implement QoS rules, enforce DNS redirection to block a few websites (porn and such), and use MAC filtering instead of a password for authentication, maybe you're on the path to becoming a network technician. If you install another router to watch the packets exchanged between the original router and the client, then you're definitely on the path to becoming a network hacker. But an engineer? Nah.
Sometimes it's the opposite: when people ask me "oh, what did you study?" and I answer "Computer Engineering", they go "oh, so you can help me install my printer?"
Fun fact: the IBM 1130 was simulated on an IBM 360 to determine how the single-card bootstrap loader would work. The 12 rows of the standard punch card were expanded to the 16 bits of the 1130's words, and the instructions were designed to make it work. OS, compilers, assemblers, peripherals — everything was simulated first.
5:11 I think there's an error in the Moore's law graph. The vertical axis is already logarithmic, but the curve still looks the way an exponential would on a linear scale. "Doubling every x years" should appear as a straight line on a log axis.
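The commenter's point is easy to verify numerically. A minimal sketch (using a hypothetical doubling series, not real transistor-count data): if counts double every fixed interval, consecutive log10 values differ by a constant, which is exactly what a straight line on a log axis means.

```python
import math

# Hypothetical series: counts that double at every step,
# starting from 2300 (the Intel 4004's transistor count).
counts = [2300 * 2 ** i for i in range(6)]

# On a log10 y-axis we would plot log10(count).
logs = [math.log10(c) for c in counts]

# The step between consecutive log values is constant: log10(2) ~= 0.30103.
# A constant slope is, by definition, a straight line.
steps = [logs[i + 1] - logs[i] for i in range(len(logs) - 1)]
print(all(abs(s - math.log10(2)) < 1e-12 for s in steps))  # True
```

So if a graph with a logarithmic vertical axis still curves upward, the data is growing faster than a fixed doubling period, or the graph is drawn wrong.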
Aw, no mention of CrashCourse Computer Science? They covered, for example, a lot of the electrical-engineering-y hardware basics at the lowest levels, so have a look if this episode sparked your interest.
The fact that 3% of the energy produced in the world is used by computers was interesting. It would be even more interesting to know what percentage is used for cryptocurrency calculations.
Boss: I plan to build some software. My friend says it's a good business.
Tech director: My friend says building software can take too long, give unexpected results, risk cost overruns, suffer unstable dependencies and costly long-term support, and still face too much competition.
Boss: How about we talk about building some software?
Another thing to consider with energy efficiency is _what_ is running on the computer, as a well-written C++ program will arguably be far faster and more efficient than a well-written Java or Node program, since the C++ program traverses fewer layers of abstraction and is often optimised for specific hardware.
Adnan A It will — it'll just continue in other forms, e.g. how many nanotubes we can fit in a transistor. Kurzweil covered this issue in his law of accelerating returns.
Logic gates may become nano scale mechanical switches rather than transistors. See Eric Drexler's book Engines of Creation: The Coming Era of Nanotechnology.
Although CS and CE require the same amount of mathematics on paper, the engineering classes use that mathematics a lot more frequently. I would say CE is not the major to fall back on if the mathematics is becoming difficult for you; CS is.
6:39 Eh, that's very misleading. Components are getting more powerful, and general memory requirements are increasing, but they're also getting significantly more efficient: more performance out of the same power targets, lower overall power usage, or a mixture of both. There are also instructions-per-clock improvements, where more work gets done in a single clock cycle. And just about all new internal components will downclock or even enter ultra-low-power sleep modes when not in use, saving energy when you don't need the performance. This is a major reason we have laptops getting 10+ hours of usage on a charge and mobile devices that last for days. Even the ultra-high-end is more efficient now than it has ever been, despite having several times more processing cores.
That's true about them getting more efficient and powerful recently, but we have to remember that that efficiency and power come from the innovative engineering that chip designers are using and, in some cases, even inventing. So although chips are more powerful and efficient, none of that would be possible using the chip-architecture methods of even five years ago. That's the point of this series: with new engineering techniques and new ways of designing systems we can achieve all of this, but not without the challenges that come before success. ^^
Yes, but with diminishing returns. Leakage in smaller transistors counteracts some of the power savings from the smaller size. The smaller size also increases power density — more power per unit area — which makes it much harder to cool the circuits. Why have chip speeds pretty much settled into the 2–4 GHz range? Because it's so hard to cool them at faster clock rates.
An issue you approach but don't quite mention explicitly is dark silicon. The chips are getting smaller, but the concentrated heat prevents the whole chip from running at once. If we ran power through the whole chip it would break, so we need to leave some of the chip "dark". That reduces the benefit of jamming more transistors onto the chip. One way around this is, as was said, to reduce the heat. Another idea has been to design hardware "accelerators" that do one job very well, specializing different parts of the chip for different roles. That way we keep some of the benefits of shrinking the chip while not requiring the whole chip to be running.
There is a great difference between software and hardware engineers that isn't even touched on in this. It's telling which divisions you guys chose to focus on.
As computer chips become more complex, I'm sure their prices will skyrocket. When I diagnose a control board as the broken part, my customers often choose to buy a new appliance instead of having me replace it. Sometimes the control board costs almost as much as the appliance itself!
@Hernando Malinche Electrical engineering is much harder; it requires more analytical math (Calculus 1, 2, 3, differential equations, etc.). In computer science, if you've got Calculus 1 and discrete mathematics, you're alright.
The hardware to pull off a general AI absolutely already exists. The next hurdles are software, storage density, and datasets; if those were in the right place, we could absolutely build a facility (probably larger than any datacenter on Earth, mind) for it to run in.
I don't know about that. I'm a software engineer, and while I admit that I don't do a whole lot with machine learning in my work or even as a hobby, I'm of the belief that people overestimate the barriers to general AI. We're not there yet, but we're a lot closer than most people think... and don't even get me started on the people who believe there's something spiritual/magical/special about "minds" that can't be replicated in a computer. Because they're just delusional.
The quality of the Crash Course Engineering series has really been falling, with numerous poor word choices and misunderstandings of key concepts. At 6:12, it is very misleading to say that nanotechnology is an alternative to standard transistor methods. Instead, nanotechnology is a natural progression of electronic engineering that fits into the Moore's Law narrative, where numerous advances in engineering solutions make up what is known as nanotechnology. TL;DR: 6:12 — nanotechnology is not an alternative; it is the current standard.
Does anyone else feel that this series is severely lacking in generally useful and relevant information? The absence of any specific explanation of embedded systems in this episode is simply heartbreaking. In case it helps, I've always found computer engineering is best explained from the ground up — by which I mean from a physics point of view up to high-level programming abstractions. For future episodes, please pick an angle and stick with it throughout instead of jumping around so much, because these episodes are not nearly long enough for such a broad scope.
Crashcourse, a suggestion or maybe a request: do you plan to create a video about agricultural engineering? And am I correct that it encompasses all four pillars of engineering, namely chemical, civil, electrical, and mechanical?
I watched a video partially about reducing computer energy use on a power-sucking dual-Xeon workstation from 2005. Am I a bad person? (Joking — although the basement system I watched this on really is a dual-Xeon system from 2005, and it does indeed suck power...)
As for the end of Moore's Law: you assume it can only progress by scaling down transistor geometry, but there are other ways to double the number of transistors on a single chip.
@vvenomm 492 Not certain; however, perhaps there is some way to manufacture negative-nanometer chips. I am no computer engineer, though, so don't take my word for it!
Nitpicking, but there is a difference between memory and storage. I wouldn't store a picture of my cat fitting into a tiny box in volatile memory. I would store it on a non-volatile HDD or SSD, though 😉
Hello, I am an incoming college student. I'm really in a tight spot right now, having a hard time choosing between Computer Engineering and IT. My first choice is Computer Engineering, but my worry is: does the Computer Engineering course have many job opportunities? One thing is for sure — there are a lot of opportunities in IT here in my country, but I really want to be a computer engineer. Also, can a computer engineer work in the IT field? That is, even as a computer engineer, could I work on IT-related stuff? Thanks.
I want my next CPU to be a quattuortrigintillion core quantum processor powered by interdimensional dark energy. But it'll probably just be a 16 core Intel :'(
@@erik-ic3tp Ha ha. I would love to live long enough to see humans travel to other planets with life on them. That's my biggest regret of being much nearer the end of my life than the beginning.
Who wrote this? This is a terrible explanation. There are a lot of details in this video that are tangential, obscure, and inconsequential to someone who doesn't already understand these concepts; for those who do already understand, they are too basic. Who was this written for? And she's also a terrible speaker for this. It sounds like she's babbling on and on, just one sentence after another. Crash Course usually has good speakers. She is not.