All Wellington’s IT happenings underpinned by Multicore
Press Release - Friday, 15 February 2013
Multicore World 2013
Among the many web and IT-oriented events that Wellington hosts, few people realise that their entire underpinning is multicore, says Nicolas Erdody.
“This many-cores-on-one-chip architecture is what enables all of these consumer-centric activities, and conferences for that matter, to look and feel as good as they do,” says the Multicore World 2013 founder. The conference takes place at the Wellington Town Hall on February 19 & 20.
Wellington annually hosts events such as AnimFX, Webstock, NetHui, KiwiCon and the Web Developers Conference.
“Without the power of modern computers and servers, or the parallel programming that runs them, none of the products, services and developments shown off at these conferences would look half as impressive as they do.
“Multicore computing really is what makes such spectacular web front-end showcases tick.
“What we in New Zealand should realise is that by linking our ingenuity to multicore, as well as high-speed internet, we have an opportunity to expand and diversify our economy.”
Erdody says that the software development required for multicore computers hasn’t kept pace with the hardware.
New Zealand could, if it thought and acted strategically, position itself as a niche participant in creating solutions for global businesses, he says.
These opportunities at both a NZ Inc and individual level will be part of the two-day event which boasts a world-leading gathering of multicore experts.
The wider debate of what is required to build multicore-oriented competence and services out of New Zealand will also be discussed. “There’s no other forum in Australasia that addresses this key component for our IT future,” Erdody says.
Among Multicore World 2013 expert speakers are IBM’s Paul McKenney, Intel’s Tim Mattson, Prof Ian Foster of Argonne National Laboratory and FreeBSD’s Poul-Henning Kamp.
Registrations are available at Multicore World 2013
Nicolas Erdody, Director Open Parallel. Nicolas.email@example.com (027 521 4020)
What is multicore?
The ability of computers to process massive amounts of data has been growing ever since they were invented. But as computing power increased, processor speeds hit a physical barrier: more processing power could not be put onto a single chip without it overheating.
The problem has been solved by putting more processors onto a single chip, creating multicore chips. These multicore chips entered the mainstream market a few years ago, and all vendors currently sell them. They are now standard kit in all laptops, desktops and smartphones.
Multicore chips are also more power efficient, and there is, in theory, almost no limit to the number of cores that can be added.
Previously impossible computational tasks can now be achieved. And processes which previously took days or even weeks to perform can now be done swiftly.
But while this new processing power enables computers to do things faster, it also adds new challenges.
Before multicore, computer software was written for a single central processing unit (CPU) on a chip. To exploit the potential of multicore chips, software now needs to be written with parallel thinking.
But parallel programming is different from traditional programming, and so far few programmers have experience of it.
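To illustrate the difference (this is a hypothetical sketch, not anything from the conference itself): the same task, summing the squares of a list of numbers, can be written sequentially for one core or split across several cores. The function names and the choice of Python's standard-library process pool are illustrative assumptions.

```python
# Sketch: sequential vs parallel versions of the same CPU-bound task.
from concurrent.futures import ProcessPoolExecutor

def square(n):
    """One unit of CPU-bound work: square a number."""
    return n * n

def sum_squares_sequential(numbers):
    # Traditional style: a single core processes every item, one after another.
    return sum(square(n) for n in numbers)

def sum_squares_parallel(numbers, workers=4):
    # Multicore style: the work is divided among worker processes,
    # each of which can run on a separate core.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(square, numbers))

if __name__ == "__main__":
    data = list(range(1, 1001))
    # Both versions produce the same answer; only the execution strategy differs.
    print(sum_squares_sequential(data) == sum_squares_parallel(data))
```

The answer is identical either way; what changes is that the parallel version must decide how to divide the work, distribute it, and combine the results, which is exactly the kind of reasoning sequential programmers have rarely had to do.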
Multicore hardware is mainstream, but programming it well is (as yet) a niche skill.
In the next 10-15 years, there will be huge opportunities to translate sequential programming (‘traditional’) legacy code, and to create new software that takes full advantage of thousands of cores in the next generation of chips.
Around the world, parallel computing is currently used to process vast quantities of data produced by the internet, and the "big data" originating from social networks and from millions of intelligent data-recording devices attached to the internet.
Here in NZ it is also used in the biggest CGI rendering facility in the world at Wellington's Weta Digital.
And soon it will be a key component of the information processing required to handle the data produced by the Square Kilometre Array radio telescope, a global scientific project that New Zealand is a part of.
In addition, there is a wide range of services, solutions and systems integration challenges in connecting the two worlds together.