Explaining Edge Computing

Welcome to another video from ExplainingComputers.com. This time I’m going to talk about edge computing. This places networked computing resources as close as possible to where data is created. As we’ll see, edge computing is associated with the Internet of Things, with mesh networks, and with the application of small computing devices like these. So, let’s go and delve more deeply into computing on the network edge.

To understand edge computing, we need to reflect on the rise of the cloud. In recent years, cloud computing has been one of the biggest digital trends, and involves the delivery of computing resources over the Internet. In the early days, most of the devices that accessed cloud services were PCs and other end-user hardware. But increasingly, devices accessing cloud services are also Internet of Things, or IoT, appliances that transmit data for analysis online. Connecting cameras and other sensors to the Internet facilitates the creation of smart factories and smart homes. However, transmitting an increasing volume of data for remote, centralized processing is becoming problematic. Not least, transmitting video from online cameras to cloud-based vision recognition services can overload available network capacity and result in a slow speed of response. And this is the reason for the rise of edge computing.

Edge computing allows devices that would have relied on the cloud to process some of their own data. So, for example, a networked camera may perform local vision recognition. This can improve latency — or the time taken to generate a response from a data input — as well as reducing the cost and requirement for mass data transmission.

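To make the latency and bandwidth point concrete, here is a minimal Python sketch that times a local inference against a cloud round-trip for the same camera frame. It is purely illustrative: the model file, the cloud endpoint URL and the stand-in frame are hypothetical placeholders, not anything used in the video.

    # Illustrative comparison of edge inference time versus a cloud round-trip.
    # "defect_model.tflite" and the endpoint URL are hypothetical placeholders.
    import time

    import numpy as np
    import requests
    import tflite_runtime.interpreter as tflite

    frame = np.random.randint(0, 255, (1, 224, 224, 3), dtype=np.uint8)  # stand-in camera frame

    # Local (edge) inference: the frame never leaves the device.
    interpreter = tflite.Interpreter(model_path="defect_model.tflite")
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    t0 = time.time()
    interpreter.set_tensor(inp["index"], frame)
    interpreter.invoke()
    local_result = interpreter.get_tensor(out["index"])
    print(f"Edge inference took {(time.time() - t0) * 1000:.1f} ms")

    # Cloud alternative: upload the whole frame and wait for the response.
    t0 = time.time()
    reply = requests.post("https://vision.example.com/classify", data=frame.tobytes())
    print(f"Cloud round-trip took {(time.time() - t0) * 1000:.1f} ms")
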
Staying with our previous example, let’s consider more deeply the application of artificial neural networks for vision recognition. Today, Amazon, Google, IBM and Microsoft all offer cloud vision recognition services that can receive a still image or video feed and return a cognitive response. These cloud AI services rely on neural networks that have been pre-trained on data centre servers. When an input is received, they then perform inference — again on a cloud data centre server — to determine what the camera is looking at.

Alternatively, in an edge computing scenario, a neural network is usually still trained on a data centre server, as training requires a lot of computational power. So, for example, a neural network for use in a factory may be shown images of correctly produced and then defective products, so that it can learn to distinguish between the two. But once training is complete, a copy of the neural network is deployed to a networked camera connected to edge computing hardware. This allows it to identify defective products without transmitting any video over the network. Latency is therefore improved and the demands on the network are decreased, as data only has to be reported back when defective products are identified.

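As a rough sketch of that workflow (one possible toolchain, not the exact pipeline described in the video; the folder of factory images and the file names are hypothetical), a small “good versus defective” classifier could be trained with TensorFlow on a data centre server and then converted to a compact TensorFlow Lite file, which is the copy that gets deployed to the camera’s edge hardware:

    # Data-centre side: train a small "good vs defective" image classifier,
    # then convert it to TensorFlow Lite for deployment on the edge device.
    # "factory_images/" (with good/ and defective/ sub-folders) is a hypothetical dataset.
    import tensorflow as tf

    train_ds = tf.keras.utils.image_dataset_from_directory(
        "factory_images/",
        image_size=(224, 224),
        batch_size=32,
    )

    model = tf.keras.Sequential([
        tf.keras.layers.Rescaling(1.0 / 255),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # output near 1 = defective
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(train_ds, epochs=10)  # training stays where the computational power is

    # The compact .tflite file is what gets copied out to the networked camera's edge hardware.
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    with open("defect_model.tflite", "wb") as f:
        f.write(converter.convert())

On the camera side, only a classification score then needs to travel over the network, and only when it crosses a defect threshold.
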
This scenario of training a neural network centrally and deploying copies for execution at the edge has amazing potential. Here I’ve indicated how it could be used in vision recognition. But the same concept is equally applicable to the edge processing of audio, sensor data, and the local control of robots or other cyber-physical systems. In fact, edge hardware can be useful in any scenario where the roll-out of local computing power at the extremities of a network can reduce reliance on the cloud.

One of the challenges of both the Internet of Things and of edge computing is providing an adequate network connection to a vast number of cameras, sensors and other devices. Today, the majority of devices connected wirelessly to a local network communicate directly with a WiFi router. However, an alternative model is to create a mesh network in which individual nodes dynamically interconnect on an ad-hoc basis to facilitate data exchange. Consider, for example, the placement of moisture and temperature sensors in a large industrial greenhouse. If all of these devices had to have a direct wired or wireless connection, then a lot of infrastructure would need to be put in place. But if the sensors can be connected to edge computing devices that can establish a mesh network, then only one wired or wireless connection to the local network may be required.

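As a toy illustration of that greenhouse idea (a deliberately simplified simulation written for this article, not a real mesh protocol such as Zigbee or Thread; all of the names are made up), sensor nodes relay readings hop by hop, and only the gateway node needs the single wired or wireless uplink:

    # Toy simulation: greenhouse sensors relay readings hop by hop to one gateway,
    # which is the only node with a connection to the local network.
    import random

    class Node:
        def __init__(self, name, uplink=None):
            self.name = name
            self.uplink = uplink  # the neighbouring node this one forwards to

        def send(self, reading):
            if self.uplink is None:
                # Gateway: the single point that talks to the wider network.
                print(f"gateway uploads {reading}")
            else:
                # Ordinary node: pass the reading to the next hop in the mesh.
                self.uplink.send(reading)

    gateway = Node("gateway")
    sensor_a = Node("sensor-A", uplink=gateway)
    sensor_b = Node("sensor-B", uplink=sensor_a)  # B reaches the gateway via A

    for node in (sensor_a, sensor_b):
        reading = {"node": node.name, "temperature_c": round(random.uniform(18.0, 30.0), 1)}
        node.send(reading)
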
Edge computing hardware is defined by its location, not its size, and so some edge devices may be very powerful local servers. But this said, a lot of edge computing is destined to take place on small devices, such as single board computers. Here, for example, we have a LattePanda Alpha and a UDOO BOLT, both of which could be deployed to process data at the edge. Other potential edge devices include the Edge-V from Khadas, as we can see here — this has even got “edge” in its name — and it’s got multiple camera connectors, which is very useful for edge applications. And then over here we have a Jetson Nano SoM, a system-on-a-module, and this is a particularly interesting single board computer because it’s got a 128-CUDA-core GPU. So it’s very good for vision recognition processing at the edge. Another slightly different and very interesting device is this, the Intel Neural Compute Stick 2, or NCS2. This features a Movidius Myriad X vision processing unit, or VPU, and it’s a development kit for prototyping AI edge applications. And if I take off the end here you’ll see this is a cap, and this is actually a USB device. The idea is that you can plug it into a single board computer, such as a Raspberry Pi, in order to significantly increase that board’s capability to run edge applications like vision recognition.

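For the NCS2 in particular, a common pattern is to load a pre-converted model with Intel’s OpenVINO toolkit and target the stick’s Myriad VPU from a host such as a Raspberry Pi. The sketch below assumes OpenVINO’s Python API (a 2022.x release, where the NCS2 appears as the “MYRIAD” device) and a hypothetical model already converted to OpenVINO’s IR format:

    # Minimal sketch: run inference on an NCS2 plugged into a Raspberry Pi.
    # Assumes OpenVINO 2022.x; "defect_model.xml"/".bin" are hypothetical IR files.
    import numpy as np
    from openvino.runtime import Core

    core = Core()
    model = core.read_model("defect_model.xml")                 # IR model produced offline
    compiled = core.compile_model(model, device_name="MYRIAD")  # MYRIAD = the NCS2's VPU

    frame = np.zeros((1, 3, 224, 224), dtype=np.float32)        # stand-in camera frame
    request = compiled.create_infer_request()
    results = request.infer({0: frame})                         # inference runs on the stick
    print(next(iter(results.values())).shape)
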
The exact definition of edge computing remains a little blurry. This said, all major players agree that it places networked computing resources as close as possible to where data is created. To provide you with some more extensive definitions, IBM note that “Edge computing is an important emerging paradigm that can expand your operating model by virtualizing your cloud beyond a data center or cloud computing center. Edge computing moves application workloads from a centralized location to remote locations, such as factory floors, warehouses, distribution centers, retail stores, transportation centers, and more”. Similarly, the Open Glossary of Edge Computing from the Linux Foundation defines edge computing as “The delivery of computing capabilities to the logical extremes of a network in order to improve the performance, operating cost and reliability of applications and services. By shortening the distances between devices and the cloud resources that serve them, and also reducing network hops, edge computing mitigates the latency and bandwidth constraints of today’s Internet, ushering in new classes of applications”.

Cisco has also introduced the term “fog computing”, which it describes as “. . .a standard that defines how edge computing should work, and [which] facilitates the operation of compute, storage and networking services between end devices and cloud computing data centers”. What this means is that fog computing refers to resources that lie close to the metaphorical ground, or between the edges of a network and the remote cloud. It may be, for example, that in a factory some edge sensors communicate with local fog resources, which in turn communicate as necessary with a cloud data centre. It should be noted that the term “fog computing” is mainly used by Cisco, and is viewed by some as a marketing term rather than an entirely distinct paradigm from edge computing.

Edge computing is emerging for two reasons. The first is the rising pressure on network capacity, while the second is our growing demand for ever faster responses from AI and related applications. As a result, while for a decade we’ve been pushing computing power out to the cloud, increasingly we’re also pushing it in the opposite direction, to the local extremities of our networks.

More information on a wide range of computing developments — including AI, blockchain and quantum computing — can be found here on the ExplainingComputers YouTube channel. But now that’s it for another video. If you’ve enjoyed what you’ve seen here please press that like button. If you haven’t subscribed, please subscribe. And I hope to talk to you again very soon.

Daniel Ostrander


100 thoughts on “Explaining Edge Computing”

  1. Samyojeet Dey says:

    Hi I have a request can you make a video on Deepin OS please? The link below shows that it is better, but I want your expertise and advice in this.
    https://www.forbes.com/sites/jasonevangelho/2018/12/10/meet-the-linux-desktop-that-blows-away-windows-10-and-macos/amp/

  2. Funky Monkey says:

    Good idea..have all your data stored on somebody else's hardware then have all your appliances hooked into the interweb so {{{they}}} can spy on you

  3. Eduardo Alvarez says:

    So millennials discovered local cache

  4. Ron Lewenberg says:

    Given the increasing processing power in phone and tablet system on chips, most notably the newest iPhones and iPad Pro, would you expect to see Edge like features move over to these devices as well? This would allow for a lower latency, and I could see a new system where there is localized learning.
    And do you think that in your own that works or chips for these will be built into Intel and AMD chipsets in the near future? It seems to me that the PC is being left behind.

    PS. There already are implementations for active directory in Windows Server 2016 and 2019, which creates an edge like paradigm. The active directory for managing all of the computers, devices, and accounts is centralized in a cloud-based server with the local servers being copies. In some cases these local servers can an accountant update data, but not others they are simple copies. Either way, there's regular synchronisation at set times.

  5. An Kaz says:

    9:00
    Is "cloud" not still essentially a marketing term?
    I never stopped seeing it that way. Just meaning "web service," be it storage, processing, hosting, software, etc.
    It's humorous thinking companies decided to make the "thing" in their IoT less useless to save on bandwidth costs.
    Edge computing as a concept is somewhat understandable, but to the end-user it may as well be a device that half-works on its own but is still dependent on the net and phoning-home, at least with the AI example given.

  6. Jamie Whitehorn says:

    Another great video of just the right length. No waffle, no banter, just the information you need to understand the concept.

    It makes me wonder, is this process cyclic? 50 years ago compute power was expensive so processing was centralised with mainframes and dumb terminals. Then PCs change the landscape with cheap processing and we de-centralised. Then we upped the amount of processing we needed to handle the vast amounts of data we are now collecting so the Cloud was born and we centralised again. Now we've got dedicated devices like VPUs and pre-trained neural nets that can offload the processing and a limited resource of bandwidth, so we're decentralising again. I wonder if we "fix" the bandwidth problem will we centralise again …😀

  7. cicada says:

    way to make the video 10 minutes bro. this 10 minute thing is getting ridiculous and makes me not want to watch YouTube anymore.

  8. Saskia van Houtert says:

    It's obvious that computers can be related to each other and can take notice of what happens in the environment, thanks for showing and kind regards.

  9. Chrysippus says:

    Q. How to make the cloud more reliable?
    A. Rely less on the cloud.

  10. Paul Milligan says:

    Thanks for another excellent tutorial. I could never work out how training your Raspberry Pi to tell the difference between an orange and a banana was going to change the world.

  11. Pqrst Zxerty says:

    Video example – but one reason for storing CCTV data live in the cloud is so robbers cannot alter the data. That's why people use YouTube Live for protection.

  12. DK Stoney says:

    In the future, you'll no longer have a name. You'll just be a numbered data node and valued only for how many unique data sets you can provide.

  13. yumri4 says:

    So edge computing is them realizing, again, that local compute is better than remote server compute. It basically is a cycle and we are on the side of the cycle that is moving back to local compute from remote compute.
    I agree edge computing is important and interesting in that the data sets needed for some of this are too large to sit on the local device. Having to go to the cloud many times is a bad solution so I am happy a solution has been found for that. The data set has just grown too big for several applications of use, mostly recognition software that has to ID someone with a name instead of a true or false answer.
    Factory products can be programmed into a central server then compared against however many times needed for factory QA but facial recognition is a no go for locally found.
    Temperatures can be done almost all locally with 1 server connecting to all the up to 254 nodes … or more if you want to use multiple network subset address spaces and the software allows for that.

  14. Idylchatter says:

    59 down votes from Windows 3.1 users.

  15. johnalmason says:

    You have to laugh at the definitions of Edge Computing given by IBM and the Linux Foundation from 7:23 onwards. Basically, what they're saying is: 'we sold you the idea of Cloud Computing, and like a fool you bought it. But you've maxed out your bandwidth doing it – and we've reached the limit of what we can take, store and process in the Cloud (i.e. our Datacentres). So now we're pushing the storage and processing workload back to you (you know, like you did in the first place) and we're going to charge you for the privilege by selling you a load of new devices to do it with'. And – yet again – we'll fall for it.

  16. Allen Johnson says:

    I believe we will soon have truly distributed computing in the sense that all devices will be mesh networked and process shared workloads. Small jobs may only use one system or a small group of systems, while larger jobs will spread out further, like SETI. The big cloud companies will have to make some changes in their business model to continue being profitable.

  17. Gábor Kimmel says:

    Never understood this categorization. In the very beginning, there were computers. Then someone thought about using one computer from different places simultaneously, and made the terminals. Then computers became cheaper so everyone got their own, capable computers. Than stuff happened and we needed giant server clusters (at the end of the day, the "cloud" is just that, servers) for computing rather than the flimsy hardware in smartphones and IoT devices. Now some asshat coins the term "edge computing", because we are not using someone else's computing power from afar, instead we use powerful hardware onsite for this. See the pattern?

  18. Siphesihle Khoza says:

    If we have ideas for new episodes, where can we share them, Chris?

  19. Conor Hanley says:

    Seems a very nebulous and foggy buzzword which amounts to computing on one's own network not someone else's cloud with AI ,perhaps. Nothing to see here , move along.

  20. Varde1234 says:

    You give me an Owl vibe.

  21. Computer Tricks says:

    Thanks for sharing. Awesome video

  22. Bob says:

    I thought this video was about edging… boy was I wrong

  23. Joseph Clark says:

    So edge is the opposite of cloud.

  24. Ross Knowles says:

    it's good that there is a reliability issue… it'll slow the terminators down

  25. giorgio gava says:

    absolutely interesting for a neophyte like me…thank you.

  26. Tcll5850 says:

    well at least now I know what I should stay away from is called
    a local offline service is better than any cloud

  27. Deses says:

    So, let me get this straight.
    We used to NOT have a cloud and do everything locally, then they sold us the cloud as the best thing ever and now they want us to move again to the old ways with a fancy new name and a price markup, I assume.

  28. Franken Berry says:

    Many have commented on the "Circle of Computing" that seems to happen where we centralize then decentralize the stuff doing the number crunching. While there are marketing forces trying to capitalize on the trend, I think it's the result of a natural feedback loop.

    We have a range of possible solutions.

    At one extreme all computing and data is done/kept locally. At the other extreme it is processed and stored centrally. There are various technological and economic constraints.

    At one point the only type of computers were big and expensive. This led to a centralized solution. Along comes relatively cheap minicomputers local to a single building or floor which helped to mitigate the expensive telecommunications costs and delays caused by centralized development teams.

    Then came inexpensive PCs and LANs. Computing/data was pushed out onto individual desktops. Time rolls on and lo and behold it turns out that all those highly specialized and trained people who had been talking about keeping backups, properly describing your requirements and other "we don't need to do that but actually do" stuff were actually right.

    The costs of telecommunications went down, and there is a problem with local design teams only thinking of their local problems, leading to individual silos creating multiple incompatible solutions for the same problem, none of which easily talk to each other or give the higher levels of the organization what they need. The pendulum swings and we are back to a more centralized solution with the newly named cloud, which looks an awful lot like the networked mainframe computing that has been chugging along with big iron for several score years.

    We are now starting to see that solution giving way, since the hammer (central computing) requires an ever larger number of nails (communications speed and volume).

    Basically either solution has pluses and minuses, a yin and yang if you will. If one is the "solution du jour" it gets overused and the other starts to look better, hence the feedback loop comment.

  29. Gwe Keren says:

    Human QA will be replaced by computer QA. RIP QA engineer…

  30. Elvira Elora Milosic says:

    Edge computing is a logical step.
    Simplest way to unburden the internet.
    NN, ML, huge amount of data…, but as well powerful SBCs, faster machines and greater storage makes edge computing an expected step.
    But more important, much more possibilities to experiment with, on individual level, affordable technology with power.
    Citius, altius, fortius.

    Kind of'reversing', before cloud computing era. 🤔

    Excellent video! 👌🏻✨
    Cheerio Chris! 👋🏻

  31. Meinfred says:

    "Single Board Computer" deceptively sounds like "Civil War Computer".

  32. berighteous says:

    so it's moving cloud computing away from the cloud…like we used to do it before the cloud. So, like we knew the whole time, the cloud is useless. Thanks.

  33. jameswalker199 says:

    I think edge computing could be very helpful for individuals, and not just for enterprise users. I believe it was in the comments of one of your videos – or perhaps the video itself – that someone talked about the notion of a databox that would use a pretrained neural network to take in all the data from devices in your home and either make inferences by itself or convert that data to something more acceptable to send to a larger datacentre. The reasoning behind it is that the company that makes your smart teapot doesn't need to know the Unix timestamp of when you put the kettle on if it only wants to know how many cups of tea you have in a day, so the databox would do all the logging and send a revised average of cups of tea per day to the datacentre at the end of each week.

    The cool thing about it is that your entire house could automate a lot of things while being offline; e.g. your smart shower would tell the databox that its just been turned on, and the databox would infer that you don't want to step out of the warm shower into a cold room, so it turns the heating up a little. The databox also uses information from your smart security system to know what your usual route is after you've taken a shower, and draws your smart blinds so people can't see you walking to the bedroom, and all of that without ever touching a computer that's not in your house, or sending data to a device that doesn't need to know.

  34. Scottius Nevious says:

    What came to my mind was my highschool network server. It connected all the computers together so you can log onto your account on any computer in the school.

  35. Niels Daemen says:

    I have been watching this channel for 10 years now, and this guy hasn't changed a bit! I love it!

  36. Kruemmelbande The Cat says:

    Can you explain Internet explorer computing

  37. Robin Lovell says:

    It’s not a cloud – it’s just someone else’s computer

  38. Don Basta says:

    Your explanation of cloud and edge computing reminds me of the old client server model that evolved into cooperative computing where the client PC would process the data before transmitting to the server. Lots of new terminology.

  39. Scott Watschke says:

    Very well done video, informative.

  40. joyela aeuvunya says:

    so, like a saturation of devices that are inbetween each other in the connections….? sort of more computers to fill in the gaps in the world where there aren't any computers or networks, creates a stronger network? it makes sense, like the spider spinning more layers into the web 🙂

  41. Bando LyriX says:

    This is a good one…thanks

  42. Sean Bergstedt says:

    I believe edge computing is just a reference to explicit data graph execution (EDGE). I'm not sure if the edge computing brand actually implements processors that are built with the EDGE instruction set and corresponding data flow processor, but I think it's along that vein of thinking. Maybe it's simulated/virtualized? Anyway, that's my take after having researched the TRIPs architecture and data-flow model.

  43. Justyn says:

    We've come full circle. Mainframes (cloud) to microcomputers (not cloud) to the Internet (cloud) to fog computing (not cloud).

  44. Stacey Bright says:

    Wait are neural networks basically designing ASICs, in the context of edge computing?

  45. OPhotoVideo.Com says:

    YOU ARE MY ROCK STAR

  46. Iron Black says:

    I hate Macs

  47. Larry Webber says:

    Thanks Chris for a very informative and well done video.

  48. LMacNeill says:

    What is old is new again. The '50s, '60s, and '70s were a time of nothing but cloud computing — you had large mainframe computers (the 'cloud' of the day) attached to dumb terminals so people could access these computing resources.

    Then the 1980s and 1990s came along and the PC revolution happened — people had their own PCs that weren't connected to any networks at all, or to a very basic file-sharing system at most. All computing was done locally.

    Then the '00s and '10s — the cloud came back in the form of the Internet. People still did local computing, of course, but offloaded more and more of their computing and storage to the cloud, as our devices became smaller and, interestingly enough, less powerful, because we traded power for mobility.

    Now, as we reach the '20s and beyond, we go back to "local" computing resources, pulling away from the cloud and doing more computing on our own hardware — relying less and less on remote hardware.

    It's fun to watch the pendulum swing back and forth.

  49. Technical Giboss says:

    AI with Cloud computing simultaneously is big industry changer.

  50. DoubleIrish DutchSandwich says:

    Anybody else suspicious about whether the host is a computer himself?

  51. Kenneth McMillion says:

    https://www.kickstarter.com/projects/teamiot/artificial-intelligence-neural-computing-powered-by-intel

  52. Aussie Gamer says:

    Haha, let them have 'The Fog'. 'Cloud' was considered silly at first, but we got used to it in the end.

  53. Steve Tattersall says:

    An interesting video thanks Chris. Are you planning on showing some actual edge computing demonstrations on some SBCs? I'd like to see some more.

  54. Joanna Kleinheksel-Horn says:

    Thank you! That was very informative 😊

  55. JonnyInfinite says:

    This sounds basically like a thin client but with AI. I'm assuming it's more to do with how Edge systems interface with cloud and local.

    The latency thing interests me from a cloud gaming point of view. Google's bullsh*t 'negative latency' is guaranteed to lead to Stadia to flop with hardcore gamers, but the option for AI that can handle aspects locally wouldn't be as much of an issue in bridging the gap between full cloud gaming (which will always suffer latency) and local processing (which will always be limited to the hardware available).

    In this way you can make use of say, powerful local gaming hardware from a graphical standpoint, but use cloud servers to offload AI-centric aspects through an Edge type interface…

  56. Milosz Ostrow says:

    Nothing has changed in the last 50 years: It's the continuing battle of those who want to centralize computing under their control (mainframes) and those who want to decentralize computing (personal computers).

  57. Juan Mulford says:

    Probably the best explanation on YouTube. Subscribed.

  58. marc carter says:

    Thanks, Christopher! Watching the programs you make, in particular, the ones on your AMD build, gave me the confidence to build my own AMD system. I love learning from you, please keep teaching.

  59. Michael Kuhn says:

    Wow back to distributed processing. That's new..

  60. Hunter's Moon says:

    So even computers are into edging.

  61. Zacharia Cassim says:

    Proud Nottingham Business School alumni. I remember your lectures on cloud computing eons before it was mainstream.

  62. Spuds says:

    Defective?? Sounds like the Rise of the Daleks from Dr. Who or the BORG from Star Trek!!!🤔🤔🤔🤔

  63. Thomas Cott says:

    Hi Chris, nice video. Currently, I feel that Edge, Cloud and Fog computing are not in my future. Interesting video though.

    On a second note, going back to the SCSI situation I mentioned to you before, I had a thought that I want to share with you. What would you think of slapping a SCSI card ( I'm told that they still exist ) in a desktop and then network it so that you could park it in your basement, or closet and talk to it via your Pi? Let me know what you think. Cheers.

  64. Felony Videos says:

    If the technological singularity is not implemented in our phones themselves, we will lose the future to evil giants like Google. We need a distributed AI, not a single point of machine intelligence.

  65. Durrpadil says:

    Cloud <- Fog <- Swamp jk 😂

  66. Summersault says:

    I thought edge computing was a Microsoft browser based OS

  67. Dr mosfet says:

    New names for the same old things only difference is CPU, GPU, TPU horsepower. Sun microsystems company's motto "The Network Is The Computer" RIP 2010. Neural chips not new ether Intel was making them in the 1990's they just didn't catch on, not enough horsepower. But good to know what the latest terminology is.

  68. Vista Zeng says:

    I have one of that exact computing stick but I have no idea how to use it. It seems that in the facial recognition scenario, my PC can compute faster.

  69. Gaelio Bauduin says:

    You kinda look like Christopher Barnat

  70. Jass&OtherStuff says:

    The world has come a long way since I periodically had to empty chad from the paper tape punch.

  71. Shaun Grace says:

    If you ever look at the natural number spiral, no two spiral nodes ever overlap on a line if line drawn straight out from centre. What saying is you can send so much data in one optic path/line you'll still have room to send more when finished

    How, in practical terms, is have natural number spiral cone and each node sends data upon same path/line. The spacings will never overlap. Obviously, each node must wake up its fellow partner node/receptor at other end

    Bit of geometry needed but it should work..if resonances of partner nodes at each end know when a pressure spike is coming/going or not, for its bit of territory

    Hope makes sense first read. If not you'll work it into a viable form

  72. I H says:

    You're futzing with a terminal

  73. ProCactus says:

    If edge computing is a result of cloud computing. Then it's all bullshit. Cloud computing should be avoided, therefore edge computing won't exist. Also if all you need is a network of some kind then all PC's are edge computers. The term edge computing is just as stupid as trying to solve a problem with cloud computing without trying otherwise. Cloud computing only exist to harvest your data. Too much harvesting going on these days. How can it be trusted.

  74. Manfred Smith says:

    The reason we have to move away from cloud computing is that we need all the bandwidth for watching all the youtube videos from explainingcomputers.com! 😁

  75. SugarBeetMC says:

    One thing you didn't mention is that edge computing potentially gives the users back some control over their data and the devices processing it. No longer would all data be sent into an ominous "cloud" (which is a marketing term for "somebody else's computer") but be able to be processed under the oversight of the user creating it.
    Other than that gripe, that was an excellent overview!

  76. TechBaron, Cameras and more! says:

    Bono has left the chat

  77. Nexodus ZeroX says:

    Edge Computing is realistically just a term that anime weebs use to distinguish themselves from the crowd. You playing Doki Doki literature club and as such.

  78. Isul Machdar says:

    Introducing FROG computing!

    Wrekk! 😂

  79. Jonathan Nunez says:

    What kind of edge Computing neural network would I use to have a machine learn to do a task. Like a drone flying, or a mechanical dog learning to walk

  80. J Neilson says:

    Chris sounds like he's just been to the dentist and the novocaine hasn't quite worn off yet.

  81. Tony Blackwell says:

    Thanks for the detailed explanation

  82. sailaab says:

    so how close might we be¿ in merging voice requests and such "edge" enabled surveillance cameras to allow conditional access to areas.. without one having to shove their face within 5 centimetres?

    outside of research labs or defence use, if it exists.. then probably iyam unawares.

    it should be easier to have the Fedex person or mailman set important parcels, items carefully inside the living room
    or
    for the maharaaj (cook) to get access to the kitchen and dining area .. while keeping the bedrooms and other areas out of bounds.

  83. Loch Ness Biker says:

    Hey Chris. Just been re-watching the 100th Video. Yeh, Mini Disc. What a GR8 format. I still use mine on a daily basis. I have copied my many hundreds of ( irreplaceable ) Cassettes onto MD … B4 they WEAR OUT ! ! … ( I am an Audiophile ) How about an updated ‘’desk tour / my PC’s’’ video, in the same style as your 100th video content. GR8 channel. I have been Sub’d since your ‘first upload’. ( How the world of Poota’s has changed, eh ! ) How many video’s now ? Keep ‘em coming. ATB

  84. Pac Tube says:

    I know that video is newer than my face but I feel like its from 2000s

  85. Inu Yasha says:

    OW THE EDGE!

  86. Marco Ferreira says:

    Join #TeamTrees at TeamTrees.org

  87. Alfa101 says:

    My definition of edge computing would be "[on-premises] level-1 cache of the cloud computing"

  88. Naeem says:

    Amazing video

  89. Naeem says:

    Sir, you are amazing you explain in a very simple way so every one can understand what you are saying.
    Love from Pakistan

  90. Marc Parsons says:

    Excellent video! The information is relayed clearly and concisely. Before watching this I never knew anything about Edge Computing. Now I do. Result! Thank you

  91. Sovereign Knight says:

    Excellent!

  92. Unsteady Eddy says:

    Correct me if I'm wrong but I understood this as: in the eighties you'd have computers in the factory or office because there was no other choice. The internet was in its infancy and there was no "cloud" to offload large amounts of data into. Instead there would be a mainframe computer (which didn't have to deal with as much data back then, anyway). Then throughout the nineties and noughties we developed the internet and as our appetite for big data grew, we, humankind, built massive network interchanges and data centers full of server racks, so it became possible to "outsource" a lot of intensive number-crunching to "the cloud". However, the sheer amount of data being transferred and processed by society and the rapidly expanding "internet of things" now threatens to overload the bandwidth of even these massive, modern data centers. Therefore some of the less critical and routine computing is being put back on the factory floor for smaller "edge computers" to deal with, whilst essential data (about problems or significant results etc) is dealt with by the larger computers and/or sent to the cloud for processing in order to increase efficiency.

  93. bjetpilot says:

    It’s just Silicon Valley’s Richards dream; computing power spanned across devices. With 5G implementation that last node can handle the data throughput.

  94. Graph Guy says:

    EMP > EC
    The world will collapse because of its reliance of the cloud when EMP(s) wipe out the internet.

  95. Fit Man says:

    Mostly just jargons

  96. jimmyturpin says:

    I remember back in the 90's when Scott McNealy said "the network would be the computer", (or something very similar) and I guess now that everybody has figured out it costs a lot of money to build and maintain the centralized "cloud", those "cloud owners" are trying to force that computing resource (and costs) back on the users, (or should they now be called "edge users"?) to lower their own cloud costs. Sound a lot to me that we have come full circle.

  97. Nitinder Mohan says:

    I have been researching on edge clouds since 2014 and I commend (and approve) your work in this video. Not even many researchers have figured that "Fog" is a marketing term by CISCO to push their products. Edge, Fog, Mist, Mobile Edge, Mobile Access Edge, etc. etc. Names may be many but you hit the nail right on its head with the basic explanation of the concept. Job well done!

  98. 녹색조각 says:

    thx for great explain 🙂

  99. Soulless Pitch says:

    Do you have any idea how many miniature cameras that I can hook up to this
    Khadas edge-v
