Those Without Knowledge Of Multicore Computing Doomed To Be Left Behind
Press Release - Wednesday, 9 January 2013
Multicore World 2013
People who are unaware of the rapid evolution of computing driven by multicore processors are doomed to be left floundering in the slipstream of its possibilities.
The founder of Multicore World 2013, Nicolas Erdody, says the multicore computer has become the “new transistor”. “If you’re not up to speed, it will be extremely hard to catch up,” he says.
Multicore World 2013 is taking place at the Wellington Town Hall on February 19 & 20, and boasts an international line-up of speakers who are authorities on computer architecture and on the software that enables parallel processing and massively increased performance in computers, smartphones and other devices.
“Computer chip designers are concentrating on how to gang together lots of cores,” Erdody says. “It will be similar but different to how the previous generation of microprocessor engineers thought about the circuitry they were creating at the level of individual transistors.”
Erdody says multicore computing changes everything, and that New Zealand’s IT community and users have the opportunity to get on board now for the coming rapid evolution.
One challenge for multicore computing is that software development requiring parallel programming has not kept pace with the hardware manufacturing of computers with many processors in a single unit.
There is a huge challenge in writing programs that carry out different tasks simultaneously while also being aware of, and reacting to, the other tasks that may affect what they are trying to achieve.
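As a minimal illustrative sketch (the release itself prescribes no language or library), the coordination problem above can be seen in two Python threads updating a shared counter: without explicit synchronisation, their read-modify-write steps can interleave and silently lose updates.

```python
# Illustrative sketch of task coordination, assuming Python's standard
# threading module; everything here is an example, not part of the release.
import threading

counter = 0
lock = threading.Lock()

def task():
    """One of several concurrent tasks incrementing shared state."""
    global counter
    for _ in range(100_000):
        # Without the lock, this read-modify-write could interleave with
        # the other task's and increments would be lost.
        with lock:
            counter += 1

threads = [threading.Thread(target=task) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# With the lock, every increment survives.
assert counter == 200_000
```

Removing the `with lock:` line makes the final count nondeterministic, which is exactly the kind of subtle bug that makes parallel programming harder than sequential programming.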
This hardware-software disconnect and possible solutions, along with leading-edge thinking and developments, are part of Multicore World 2013.
Erdody says the fact that the event has been able to attract speakers who would fill auditoriums in the northern hemisphere is a good reason for Australasians to make a beeline to Wellington in mid-February.
“This is a first-rate presenter line-up; and having such well-informed multicore specialists on our doorstep is an excellent opportunity to get to the event,” Erdody says.
“Equally, the quality of conversations and business networking available because of Multicore World 2013 makes it a standout event.”
Nicolas Erdody, Director Open Parallel. Nicolas.firstname.lastname@example.org (027 521 4020)
Karen Bender, Business Growth Manager, Grow Wellington. Karen.email@example.com (021 628 144)
What is multicore?
The ability of computers to process massive amounts of data has been growing ever since they were invented. But processing speed has now reached a physical barrier: more processing power cannot be put onto a single chip without it overheating.
The problem has been solved by putting more processors onto a single chip, creating multicore chips. These multicore chips entered the mainstream market a few years ago, and all vendors currently sell them. They are now standard kit in all laptops, desktops and smartphones.
Multicore chips are also more power efficient, and the number of cores that can be added is, in theory, virtually unlimited.
Previously impossible computational tasks can now be achieved. And processes which previously took days or even weeks to perform can now be done swiftly.
But while this new processing power enables computers to do things faster, it also adds new challenges.
Before multicore, computer software was written for a single central processing unit (CPU) on a chip. To exploit the potential of multicore chips, software now needs to be written with parallelism in mind.
But parallel programming is different from traditional programming, and so far few programmers have experience with it.
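To give a flavour of the difference, here is a minimal sketch (purely illustrative, using Python's standard concurrent.futures module) of the same CPU-bound workload run sequentially on one core and then spread across the available cores:

```python
# Illustrative sketch only: the release does not prescribe a language.
from concurrent.futures import ProcessPoolExecutor

def heavy_sum(n):
    """Stand-in for a CPU-bound task."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    workloads = [2_000_000] * 4

    # Sequential ("traditional") style: one core, one task after another.
    sequential = [heavy_sum(n) for n in workloads]

    # Parallel style: the same tasks distributed across available cores.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(heavy_sum, workloads))

    # Same results; on a multicore machine the parallel run finishes sooner.
    assert sequential == parallel
```

The point of the sketch is that the results are identical but only the parallel version can use more than one core, and restructuring existing sequential code this way is exactly the work the industry now faces.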
Multicore hardware is mainstream, but programming for it remains (as yet) a niche skill.
In the next 10-15 years, there will be huge opportunities to translate legacy sequential (‘traditional’) code, and to create new software that takes full advantage of thousands of cores in the next generation of chips.
Around the world parallel computing is currently used to process vast quantities of data produced by the internet and the "big data" originating from social networks and the millions of intelligent data-recording devices attached to the internet.
Here in NZ it is also used in the biggest CGI rendering facility in the world at Wellington's Weta Digital.
And soon it will be a key component of the information processing required to handle the data produced by the Square Kilometre Array radio telescope, a global scientific project that New Zealand is part of.
In addition, there is a wide range of services, solutions and systems integration challenges in connecting the two worlds together.