
29 August 2017

Comments

Robert Cross

Who hasn't seen 'The Terminator'? What will the evil corporations do next to make life miserable for people just to make a buck? Why, robots of course. They don't take breaks, eat, sleep, get sick, or have children. Robot slaves: once you buy one, they work for free and you can always trade one in on a newer model... a lot like Donald Trump and wives.

George Rebane

RobertC 505pm - And what, pray, would a non-evil corporation do in this AI environment to make life pleasurable for people and make a buck?

BTW, that was a clever citation of Trump's divorce, since there have been no progressives who have ever traded in their spouses on newer models. You have literally taken the wind out of the conservatives' sails here.

Don Bessee

BURN! @ 548 ;-)

ScenesFromTheApocalypse

It sounds to me like R Cross had best start breaking up those Jacquard looms, to say nothing of the pick-and-place machines or modern welding setups in a car plant.

Honestly it shows a real lack of imagination to think about serious AI in terms of replacing factory workers or stockers in a grocery store. In the scheme of things, those are trivial changes.

Russ

Microsoft unveils Project Brainwave neural network platform

Microsoft has officially unveiled a new deep-learning acceleration programme, Project Brainwave, which it claims allows for real-time artificial intelligence with ultra-low latency.

Part of Microsoft's increasing efforts in the deep-learning and machine intelligence fields, Project Brainwave was formally unveiled at the Hot Chips conference late last night. Described by engineer Doug Burger in his announcement post as the result of work by a 'cross-Microsoft team,' Project Brainwave is effectively a three-layer platform: a compiler and runtime engine allowing for rapid deployment of trained models, a hardware platform based around field-programmable gate arrays (FPGAs) which acts as a deep neural network (DNN) engine, and a fully distributed system architecture.

The result, Burger claims, is a major increase in performance and the ability to treat neural networks as hardware microservices - services called directly by a server with no intervening software layer. 'This system architecture both reduces latency,' Burger explains, 'since the CPU does not need to process incoming requests, and allows very high throughput, with the FPGA processing requests as fast as the network can stream them.'

'We designed the system for real-time AI, which means the system processes requests as fast as it receives them, with ultra-low latency,' claims Burger. 'Real-time AI is becoming increasingly important as cloud infrastructures process live data streams, whether they be search queries, videos, sensor streams, or interactions with users.'

http://www.bit-tech.net/news/tech/cpus/microsoft-unveils-project-brainwave-neural-network-platform/1/

My son-in-law Darrin Jones just joined Doug Burger's organization at Microsoft as Senior Director, Hardware Business Development - Cloud Server Infrastructure. Real-time AI is coming to the Microsoft cloud and will be available to multiple users, including Robert Cross, if he has an application for deep learning. My guess is it will be a steep learning curve for Robert; he does not appear to be a fast learner.

ScenesFromTheApocalypse

Congratulations to your son-in-law, Russ. This seems to be one of those times where picking the right technical specialty has a real pay-off.

You guys who are more technically minded can correct me (I might get the words or the ideas wrong), but this is an interesting time to watch the platforms evolve. Fifty years of interlocking ideas like C/Unix have driven hardware design, and designers still seem to be feeling their way with evolving mixes of ASIC/CPU/FPGA/CPU-on-FPGA/memory on both/etc., plus the issue of leveraging legacy software.

I wonder how long it'll be until a generation of AI platform is designed by the prior one; that's when you know the idea is getting real traction. It could be that the difference between hardware and software becomes more blurred, since it was always just a distinction of convenience.

Russ

Scenes 546am -

The advantage of field-programmable gate arrays is speed. Using FPGAs, engineers can design a processor for a specific deep-learning task that takes a millisecond to do what a general-purpose processor might take seconds to do. If the learning task changes, the engineers can reconfigure the FPGAs, tailoring the processor to the new task. This rapid redesign-to-task has enormous potential that Microsoft plans to exploit across its product line. They are already using FPGAs to speed up Bing searches.
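A toy sketch of that tradeoff in plain C (illustrative only - real FPGA designs are written in hardware description languages, and the 'steps' and 'cycles' counters here are just stand-ins): a general-purpose CPU works through a dot product one multiply-accumulate at a time, while a circuit specialized to the task lays all its multipliers out side by side and consumes a whole input vector per clock.

    /* Toy illustration only -- plain C, not real FPGA gateware. */
    #include <stdio.h>

    #define N 8   /* dot-product width "hardwired" into the fabric */

    /* CPU-style: one multiply-accumulate per step, N steps per result. */
    static long dot_sequential(const int *a, const int *b, long *steps)
    {
        long acc = 0;
        for (int i = 0; i < N; i++) {
            acc += (long)a[i] * b[i];
            (*steps)++;               /* one MAC per machine step */
        }
        return acc;
    }

    /* FPGA-style: all N multipliers exist side by side in the fabric,
     * so conceptually the whole vector is consumed in a single cycle. */
    static long dot_spatial(const int *a, const int *b, long *cycles)
    {
        long acc = 0;
        for (int i = 0; i < N; i++)
            acc += (long)a[i] * b[i]; /* concurrent in real hardware */
        (*cycles)++;                  /* one result per clock */
        return acc;
    }

    int main(void)
    {
        int a[N] = {1, 2, 3, 4, 5, 6, 7, 8};
        int b[N] = {8, 7, 6, 5, 4, 3, 2, 1};
        long steps = 0, cycles = 0;

        long r1 = dot_sequential(a, b, &steps);
        long r2 = dot_spatial(a, b, &cycles);
        printf("sequential: %ld in %ld steps\n", r1, steps);
        printf("spatial:    %ld in %ld cycle(s)\n", r2, cycles);
        return 0;
    }

And the reconfigurability is the other half of the appeal: when the task changes shape, the 'circuit' gets re-synthesized for the new dimensions instead of being frozen into an ASIC.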

George Rebane

Scenes 546am - To a large extent the current platforms have already been designed by the 'prior ones'. For example, no human has done the physical design of microchips for over twenty years - the compact, heat-shedding placement of the tens of millions of micro-components and connecting conductors is just too voluminous and complex for humans to do in any reasonable time frame. Now design algorithms are capable of laying out entire multi-component boards starting with functional specifications written in so-called hardware description languages.

ScenesFromTheApocalypse

My main point is that there is sort of a 'do-over' in process. It's fairly obvious that large CPU design has been driven by a family of computer languages and typically by Unix and Unix-like OS's. Language and compiler design will tend to be bent towards CPU and overall system design, while the CPU/system will be bent towards existing language and OS needs. None of this is writ in stone (witness VLIW, DSP, and more recently GPU), but the cost of rethinking was always too high.

"To a large extent the current platforms have already been designed by the 'prior ones'."

As a tool, yes, but the aim has tracked along prior design and been locked down in many ways. It's interesting to think of the ways that existing art is filled with anthropomorphisms, or at least notions easily understood by mere people: 'files', 'bytes', 'directories', 'virtual memory', the normal split (sometimes expressed throughout the design) between program and memory.

I'll be interested to see truly novel designs, but that will take a while I think. Something like Hillis / TMC simply didn't have the commercial potential that we are seeing now. Who was to know that the brave new world would be funded by internet advertising needs?

I think what I'm most after is not only the grunt work of design, but the design itself, and (most importantly) the reason for the design to be done by non-humans.

George Rebane

Scenes 821am - And my 740am point was that such 'out of the box' new designs will come to be as machines are turned loose with just specifications that define performance functionality. (For example, the von Neumann architecture has served us well, but is not destined to rule much longer.)

Gregory

Russ and Scenes, Russ' son-in-law's education and experience is apparently in bean counting and business development, so it might be a stretch to pin technical prowess on him. As for son-in-law earnings potential, it does bode well for Russ' daughter having a pot to pee in going forward.

George, I recall being told the last big processor to be hand taped up was the Z8000, circa '78. I can't imagine the headaches experienced by that team.

Scenes, I recall Motorola sales engineers describing the MC68000 as the first processor optimized for C compilers. Of course large processors are optimized for the environments they are expected to be used in, including Un*x systems of all flavors.


Yes, FPGA's are neat, but I do miss the good ol' days of schematic capture for their design. Before I went over to the dark side.

Gregory

"What the sclerotic (‘programming school’) side of computer science did not realize for some decades, with some still in their benighted darkness, is the message Dr Kurzweil has been telling (preaching?) to people for years – that technology is advancing exponentially."

One has to love the tunnel vision of the true believers in a change that cannot be denied. To infinity and beyond! Whether it's catastrophic global warming or the Singularity, both groups have similar beliefs and tactics. I fully expect the Singularity to hit like the catastrophes before it... with a dull thud. Wait until it hits to do any victory dances, please.

Is that all there is?
https://www.youtube.com/watch?v=qe9kKf7SHco

ScenesFromTheApocalypse

"Of course large processors are optimized for the environments they are expected to be used in, including Un*x systems of all flavors."

Absolutely. The reverse being true also, so you end up in an echo chamber of design. I admit that I'll be more of a believer in stronger AI as soon as the hardware really reflects it (which I probably should have stated explicitly), although there's always the chance that this whole go-round will have lower marginal improvements in value going forward no matter what. I'd never thought of the 68k as being oriented towards C any more than something like the x86, and with Vic Poor being dead it might be hard to find out.

Admittedly, there's plenty of HW/SW horsepower right now to change the world in a number of ways, but that's the boring part of all of this. From a non-practitioner's viewpoint, the difference between superior image recognition/self-driving cars/more automated factories/blahblahblah and something really important will be a machine-run design feedback loop, probably with its own goals.

fish

A conversation that could not be held by the denizens of the FUE's catbox!

Gregory

Scenes
No, the x86 instruction set wasn't as orthogonal in its addressing modes for really good C performance, but compiler writers were clever nonetheless.
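For a concrete (if textbook) illustration of the fit in question - a sketch, not actual compiler output: the canonical C copy loop drops almost one-for-one onto the 68000's post-increment addressing mode, which is the kind of match the Motorola pitch would have been about.

    /* The canonical C copy idiom. On a 68000 the loop body maps
     * nearly one-to-one onto post-increment addressing, roughly:
     *     loop: move.b (a0)+,(a1)+
     *           bne   loop
     * The pointer bumps come "free" with the addressing mode. */
    void copy(char *dst, const char *src)
    {
        while ((*dst++ = *src++) != '\0')
            ;   /* empty body: the assignment does all the work */
    }

On the 8086 side, the same loop forces the operands into particular registers (SI/DI for the string instructions), which is the non-orthogonality at issue.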

fish
No, the FUE's denizens right now are singing the praises of Heidi Hall, the woman with a fine Pomona College education who can think you under the table and not be stumbling drunk no matter how much vodka she swims in.

"I have seen Heidi Hall at several events where alcohol was served and I have never seen her drink to excess or become sloppy. Given that she is a lady in the true sense of the word, this mistake was certainly an anomaly and I believe her when she says it won’t happen again." -Judith Lowry

Sorry, Judy, but for her to walk out and get into a car with a .22 means she's probably drinking to excess on a regular (think daily) basis. Not that that's necessarily a bad thing.

fish

Posted by: Gregory | 30 August 2017 at 02:56 PM

Uh oh... that little remark is sure to raise the ire of the FUE... once he's done posting about how neat it is to get his locally sourced coffee beans with his and his wife's names personalized on the bag!

Todd Juvinall

Swilling a .22 might seem a concern to MADD! Are they in the tank for Heidi?

George Rebane

Gentlemen, focus on the topic please.
