George Rebane
The Singularity must really be near when we have MIT physicist Max Tegmark climb aboard with the spate of science and industry luminaries who have recently discovered that intelligent machines will someday surpass humans, will systemically displace human labor, and (in a TBD form) will become the dominant lifeform on Earth. Dr Tegmark has just published his epiphany in Life 3.0, and joins Stephen Hawking, Bill Gates, Elon Musk, Nick Bostrom, … in spreading the news of the incipient Singularity to the general public. Of course, none of them dare call it Singularity; for if they did, then it would instantly identify them as Johnny-come-latelies.
For those of us in the field, as recorded herein, the possibility of Singularity has been known for at least thirty years (and more decades in science fiction). And science-savvy entrepreneurs like Ray Kurzweil (now at Google) and Jeff Hawkins (founder of Numenta, the developer of hierarchical temporal memory) have long staked out businesses that are pioneering the only branch of computer science that promises to deliver the Singularity – systems that are capable of learning from vast amounts of data present in their ‘environments’. Here I don’t want to revisit the ‘learn vs program’ debate on the path to super-intelligence other than to say that the ‘program school’ of computer science has woefully been the dragging anchor, diverting resources from the ‘learn school’, which now finally dominates the field of AI. All of the above people and technologies have been introduced and discussed here over the years (see RR’s Singularity Signposts section).
Deep learning is the label given to the latest generation of massive artificial neural networks (ANNs) that have learned to perform amazing feats requiring intelligence and cognitive processing. In most of these feats they already far surpass humans attempting the same tasks. And what is happening today in the field of deep learning will literally blow your mind, in the sense of blowing it away as redundant, should you decide to compete with it in the workplace – which brings to mind the story of John Henry, the steel drivin’ man (here).
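To make ‘learning from data’ concrete, here is a minimal sketch, in plain Python/numpy, of a tiny two-layer network teaching itself the XOR function from four examples; the layer width, learning rate, and iteration count are illustrative choices on my part, nothing canonical.

```python
import numpy as np

# Toy training data: the XOR function, a classic task no single-layer network can learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass: the network's current guess.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: nudge every weight downhill on the squared error.
    grad_out = (p - y) * p * (1 - p)
    grad_hid = (grad_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ grad_out
    b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * X.T @ grad_hid
    b1 -= lr * grad_hid.sum(axis=0)

print(np.round(p.ravel(), 2))   # should approach [0, 1, 1, 0]
```

Nothing in that loop encodes XOR; the mapping is discovered from the examples. Today’s deep nets differ from this toy mainly in scale: more layers, millions to billions of weights, and vastly more data.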
Companies large – e.g. Alphabet/Google, Microsoft, Amazon, Facebook, Baidu, Alibaba, … – and small are racing to integrate deep learned AI into their operations and products at breakneck speed. The heads-up companies are launching in-house education and training programs to introduce deep learning to their technical and management staffs. As an example, I can modestly cite my computer scientist son-in-law Roland Fernandez at Microsoft Research who has co-developed and operates that company’s online course in deep learning (q.v.).
What the sclerotic (‘programming school’) side of computer science did not realize for some decades, with some still in their benighted darkness, is the message Dr Kurzweil has been telling (preaching?) to people for years – that technology is advancing exponentially. The programmers could not conceive that very soon there would emerge computers and databases that are large and fast enough to implement ANNs with thousands of layers that learn to ‘instantly’ manipulate millions of parameters in tasks like voice understanding, image recognition, medical diagnosis, concurrent large cohort control, and on and on. And that’s just today.
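For a back-of-the-envelope sense of what ‘exponential’ means here, assume (and it is only an assumption, Moore’s-law style) that capability doubles roughly every 18 months:

```python
# Doubling every 1.5 years compounds to about 1,000x in 15 years
# and about 1,000,000x in 30 years.
for years in (15, 30):
    print(years, "years ->", round(2 ** (years / 1.5)), "x")
```

It is that compounding, not any single breakthrough, that linear thinkers keep underestimating.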
For more on Professor Tegmark’s new book, I bid you read the excellent review ‘When Machines Run Amok’ by Frank Rose.
Who hasn't seen 'The Terminator'? What will the evil corporations do next to make life miserable for people just to make a buck? Why, robots of course. They don't take breaks, eat, sleep, get sick, or have children. Robot slaves: once you buy one they work for free, and you can always trade one in on a newer model... a lot like Donald Trump and wives.
Posted by: Robert Cross | 29 August 2017 at 05:05 PM
RobertC 505pm - And what, pray, would a non-evil corporation do in this AI environment to make life pleasurable for people and make a buck?
BTW, that was a clever citation of Trump's divorce, since there have been no progressives who have ever traded in their spouses on newer models. You have literally taken the wind out of the conservatives' sails here.
Posted by: George Rebane | 29 August 2017 at 05:48 PM
BURN! @ 548 ;-)
Posted by: Don Bessee | 29 August 2017 at 05:57 PM
It sounds to me like R Cross had best start breaking up those Jacquard looms, to say nothing of a pick-and-place machine or a modern welding setup in a car plant.
Honestly it shows a real lack of imagination to think about serious AI in terms of replacing factory workers or stockers in a grocery store. In the scheme of things, those are trivial changes.
Posted by: ScenesFromTheApocalypse | 29 August 2017 at 06:35 PM
Microsoft unveils Project Brainwave neural network platform
Microsoft has officially unveiled a new deep-learning acceleration programme, Project Brainwave, which it claims allows for real-time artificial intelligence with ultra-low latency.
Part of Microsoft's increasing efforts in the deep-learning and machine intelligence fields, Project Brainwave was formally unveiled at the Hot Chips conference late last night. Described by engineer Doug Burger in his announcement post as the work of a 'cross-Microsoft team,' Project Brainwave is effectively a three-layer platform: a compiler and runtime engine allowing for rapid deployment of trained models, a hardware platform based around field-programmable gate arrays (FPGAs) which acts as a deep neural network (DNN) engine, and a fully distributed system architecture.
The result, Burger claims, is a major increase in performance and the ability to treat neural networks as hardware microservices – services called directly by a server with no intervening software layer. 'This system architecture both reduces latency,' Burger explains, 'since the CPU does not need to process incoming requests, and allows very high throughput, with the FPGA processing requests as fast as the network can stream them.'
'We designed the system for real-time AI, which means the system processes requests as fast as it receives them, with ultra-low latency,' claims Burger. 'Real-time AI is becoming increasingly important as cloud infrastructures process live data streams, whether they be search queries, videos, sensor streams, or interactions with users.'
http://www.bit-tech.net/news/tech/cpus/microsoft-unveils-project-brainwave-neural-network-platform/1/
My son-in-law Darrin Jones just joined Doug Burger's organization at Microsoft as Senior Director, Hardware Business Development - Cloud Server Infrastructure. Real-time AI is coming to the Microsoft cloud and will be available to multiple users, including Robert Cross if he had an application for deep learning. My guess is it will be a steep learning curve for Robert; he does not appear to be a fast learner.
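For anyone who hasn't called a hosted model before, here is a rough sketch of what using a neural network as a cloud service looks like from the client side. The endpoint URL and JSON shape below are made up for illustration; they are not Brainwave's actual interface.

```python
import json
import urllib.request

# Hypothetical scoring endpoint (illustrative only).  The trained model sits
# behind a service boundary and the caller simply streams requests at it.
ENDPOINT = "https://example-cloud.invalid/scoring/v1/my-dnn-model"

def score(features):
    """Send one feature vector to the hosted model and return its prediction."""
    payload = json.dumps({"inputs": features}).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["outputs"]

print(score([0.1, 0.7, 0.2]))
```

The promise of the FPGA layer is that on the server side a request like that never waits in a software queue; per Burger, the FPGAs handle it as fast as the network can deliver it.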
Posted by: Russ | 29 August 2017 at 08:35 PM
Congratulations to your son-in-law, Russ. This seems to be one of those times where picking the right technical specialty has a real pay-off.
You guys who are more technically minded can correct me (I might get the words or the ideas wrong), but this is an interesting time to watch the platforms evolve. 50 years of interlocking ideas like 'C'/Unix have driven hardware design, and designers still seem to be feeling their way through evolving mixes of ASIC/CPU/FPGA/CPU-on-FPGA/memory on both/etc., plus the issue of leveraging legacy software.
I wonder how long it'll be until a generation of AI platform is designed by the prior one; that's when you know the idea is getting real traction. It could be that the difference between hardware and software becomes more blurred, since it was always just a distinction of convenience.
Posted by: ScenesFromTheApocalypse | 30 August 2017 at 05:46 AM
Scenes @5:46
The advantage of field-programmable gate arrays (FPGAs) is speed. Using FPGAs, engineers can design a processor for a specific deep-learning task that takes a millisecond to perform what a general-purpose processor might take seconds to do. If the learning task changes, the engineers can reconfigure the FPGAs, tailoring the processor to the new task. This rapid redesign-to-task has enormous potential that Microsoft plans to exploit across its product line. They are already using FPGAs to speed up Bing searches.
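To see why the gap can be milliseconds versus seconds, look at the inner loop of DNN inference, which is almost entirely multiply-accumulate. Here is a rough Python rendering of one dense layer (the sizes are illustrative, not from any real model):

```python
import numpy as np

def dense_layer(x, W, b):
    """One fully connected layer: every output is a long chain of multiply-accumulates."""
    out = np.zeros(W.shape[1])
    for j in range(W.shape[1]):        # each output neuron
        acc = b[j]
        for i in range(W.shape[0]):    # a CPU walks these a few at a time;
            acc += x[i] * W[i, j]      # an FPGA lays them out as parallel circuits
        out[j] = max(acc, 0.0)         # ReLU activation
    return out

x = np.random.rand(256)                # illustrative layer sizes
W = np.random.rand(256, 128)
b = np.zeros(128)
print(dense_layer(x, W, b).shape)      # (128,)
```

A general-purpose processor executes those multiply-accumulates a handful at a time; an FPGA configured for this exact layer can do thousands per clock cycle, and when the model changes you re-synthesize the gates instead of buying a new chip.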
Posted by: Russ | 30 August 2017 at 07:12 AM
Scenes 546am - To a large extent the current platforms have already been designed by the 'prior ones'. For example, no human has done the physical design of microchips for over twenty years – the compact, heat-shedding placement of the tens of millions of micro-components and connecting conductors is just too voluminous and complex for humans to do in any reasonable time frame. Now design algorithms are capable of laying out entire multi-component boards starting with functional specifications written in so-called hardware description languages.
Posted by: George Rebane | 30 August 2017 at 07:40 AM
My main point is that there is sort of a 'do-over' in process. It's fairly obvious that large CPU design has been driven by a family of computer languages and typically by Unix and Unix-like OS's. Language and compiler design will tend to be bent towards CPU and overall system design, while the CPU/system will be bent towards existing language and OS needs. None of this is writ in stone (witness VLIW, DSP, and more recently GPU), but the cost of rethinking was always too high.
"To a large extent the current platforms have already been designed by the 'prior ones'."
As a tool, yes, but the aim has tracked along prior design and been locked down in many ways. It's interesting to think of the ways that existing art is filled with anthropomorphisms, or at least notions easily understood by mere people: 'files', 'bytes', 'directories', 'virtual memory', the normal split (sometimes expressed throughout the design) between program and memory.
I'll be interested to see truly novel designs, but that will take a while I think. Something like Hillis / TMC simply didn't have the commercial potential that we are seeing now. Who was to know that the brave new world would be funded by internet advertising needs?
I think what I'm most after is not only the grunt work of design, but the design itself, and (most importantly) the reason for the design to be done by non-humans.
Posted by: ScenesFromTheApocalypse | 30 August 2017 at 08:21 AM
Scenes 821am - And my 740am point was that such 'out of the box' new designs will come to be as machines are turned loose with just specifications that define performance functionality. (For example, the von Neumann architecture has served us well, but is not destined to rule much longer.)
Posted by: George Rebane | 30 August 2017 at 10:15 AM
Russ and Scenes, Russ' son-in-law's education and experience is apparently in bean counting and business development, so it might be a stretch to pin a technical prowess upon him. For son-in-law earnings potential, it does bode well for Russ' daughter having a pot to pee in going forward.
George, I recall being told the last big processor to be hand taped up was the Z8000, circa '78. I can't imagine the headaches experienced by that team.
Scenes, I recall Motorola sales engineers describing the MC68000 as the first processor optimized for C compilers. Of course large processors are optimized for the environments they are expected to be used in, including Un*x systems of all flavors.
Yes, FPGAs are neat, but I do miss the good ol' days of schematic capture for their design. Before I went over to the dark side.
Posted by: Gregory | 30 August 2017 at 12:29 PM
"What the sclerotic (‘programming school’) side of computer science did not realize for some decades, with some still in their benighted darkness, is the message Dr Kurzweil has been telling (preaching?) to people for years – that technology is advancing exponentially."
One has to love the tunnel vision of the true believers of a change that cannot be denied. To infinity and beyond! Whether it's catastrophic global warming or the Singularity, both groups have similar beliefs and tactics. I fully expect the Singularity to hit like the catastrophes before it... with a dull thud. Wait until it hits to do any victory dances, please.
Is that all there is?
https://www.youtube.com/watch?v=qe9kKf7SHco
Posted by: Gregory | 30 August 2017 at 12:45 PM
"Of course large processors are optimized for the environments they are expected to be used in, including Un*x systems of all flavors."
Absolutely, and the reverse is true also, so you end up in an echo chamber of design. I admit that I'll be more of a believer in stronger AI as soon as the hardware really reflects it (which I probably should have stated explicitly), although there's always the chance that this whole go-round will have lower marginal improvements in value going forward no matter what. I had never thought of the 68k as being oriented towards C any more than something like x86, and with Vic Poor being dead it might be hard to find out.
Admittedly, there's plenty of HW/SW horsepower right now to change the world in a number of ways, but that's the boring part of all of this. From a non-practitioner's viewpoint, the difference between superior image recognition/self-driving cars/more automated factories/blahblahblah and something really important will be a machine-run design feedback loop, probably with its own goals.
Posted by: ScenesFromTheApocalypse | 30 August 2017 at 01:54 PM
A conversation that could not be held by the denizens of the FUE's catbox!
Posted by: fish | 30 August 2017 at 02:19 PM
Scenes
No, the x86 instruction set wasn't as orthogonal in its addressing modes for really good C performance, but compiler writers were clever nonetheless.
fish
No, the FUE's denizens right now are singing the praises of Heidi Hall, the woman with a fine Pomona College education who can think you under the table and not be stumbling drunk no matter how much vodka she swims in.
"I have seen Heidi Hall at several events where alcohol was served and I have never seen her drink to excess or become sloppy. Given that she is a lady in the true sense of the word, this mistake was certainly an anomaly and I believe her when she says it won’t happen again." -Judith Lowry
Sorry, Judy, but for her to walk out and get into a car with a .22 blood-alcohol level means she's probably drinking to excess on a regular (think daily) basis. Not that that's necessarily a bad thing.
Posted by: Gregory | 30 August 2017 at 02:56 PM
Posted by: Gregory | 30 August 2017 at 02:56 PM
Uh oh… that little remark is sure to raise the ire of the FUE… once he's done posting about how neat it is to get his locally sourced coffee beans with his and his wife's names personalized on the bag!
Posted by: fish | 30 August 2017 at 03:05 PM
Swilling a .22 might seem a concern to MADD! Are they in the tank for Heidi?
Posted by: Todd Juvinall | 30 August 2017 at 04:03 PM
Gentlemen, focus on the topic please.
Posted by: George Rebane | 30 August 2017 at 05:04 PM