Conference Pitches New Zealand As A Center For Multicore Excellence
Press Release - 12 December 2012
Positioning New Zealand at the forefront of a revolutionary advance in computing is the objective of a second conference on multicore computing, to be held this coming February in Wellington.
The two-day conference, on February 19 & 20 at the Wellington Town Hall, will build on Multicore World 2012, which also featured leading software and hardware speakers and attendees.
Multicore World is the brainchild of Open Parallel director Nicolas Erdody, who has also started up several international entrepreneurial businesses.
The conference will discuss cutting-edge technologies already under development, in which multicore machines are capable of remarkable tasks as long as the software is written to take advantage of their capabilities.
“We can foster an ecosystem of software communities, industry, academia and investors around this permanent change that affects every computer on the planet,” says Multicore World 2013 organiser Erdody.
“Our main conference goal is to provide IT decision-makers and software community leaders with knowledge and connections about the business and technology implications around multicore requirements over the coming years.
“Because parallel computing and multicore affect everyone, anyone with even half an interest in computers, smart devices and internet development should look to attend.”
The initial Multicore World 2013 programme is now available, and already a number of notable speakers have been confirmed for the conference.
Attendees are encouraged to register early, and to take advantage of the reduced early-bird booking fee, to avoid disappointment. The early-bird offer, which will save attendees $200 off the full ticket price ($950), will expire on January 14.
Confirmed speakers include:
• Paul McKenney – distinguished engineer and Linux CTO of IBM (US);
• Poul-Henning Kamp – chief architect of Varnish and long-time FreeBSD developer (Denmark);
• Professor Ian Foster – director, Computation Institute, Argonne National Laboratory (US).
Erdody says the conference's timing is perfect for New Zealand, as parallel computing and multicore are at once mainstream and niche.
“If we hone our vision and position our software industry as being capable of responding to the worldwide industry’s needs for parallel programming, we have the national opportunity to reap the rewards of this new platform technology,” Erdody says.
“Multicore World 2013 is a means to achieve this vision.”
Nicolas Erdody, Director Open Parallel. Nicolas.firstname.lastname@example.org (027 521 4020)
Karen Bender, Business Growth Manager, Grow Wellington. Karen.email@example.com (021 628 144)
What is multicore?
The ability of computers to process massive amounts of data has been growing ever since they were invented. But processor clock speeds have now hit a physical barrier: more processing power cannot be packed onto a single-core chip without overheating.
The problem has been solved by putting more processors onto a single chip, creating multicore chips. These multicore chips entered the mainstream market a few years ago, and all vendors currently sell them. They are now standard kit in all laptops, desktops and smartphones.
Multicore chips are also more power-efficient, and the number of cores that can be added is, in principle, virtually unlimited.
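As a quick illustration of how standard this kit has become, you can ask your own machine how many cores it exposes using nothing but Python's standard library (a minimal sketch; the reported count varies by machine):

```python
# Report the number of CPU cores visible to the operating system.
# Requires only Python's standard library -- no extra installation.
import os

print(os.cpu_count())  # e.g. 8 on a typical modern laptop
```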
Previously impossible computational tasks can now be achieved, and processes which previously took days or even weeks to perform can now be done swiftly.
But while this new processing power enables computers to do things faster, it also adds new challenges.
Before multicore, computer software was written for a single central processing unit (CPU) on a chip. To exploit the potential of multicore chips, software now needs to be written with parallelism in mind.
But parallel programming is different from traditional programming, and so far few programmers have experience with it.
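The shift in thinking described above can be sketched in a few lines of Python, using the standard library's concurrent.futures module. The task here (summing squares over number ranges) is illustrative only; the point is that the work must first be split into independent chunks before separate cores can share it:

```python
# A minimal sketch of sequential vs. parallel thinking, using only
# Python's standard library. The workload is a stand-in for any
# CPU-heavy task that can be divided into independent pieces.
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(bounds):
    """Sum n*n over the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))

if __name__ == "__main__":
    # Step 1: split one big job into independent chunks.
    chunks = [(0, 250_000), (250_000, 500_000),
              (500_000, 750_000), (750_000, 1_000_000)]

    # Sequential ('traditional') style: one core, one chunk at a time.
    sequential_total = sum(sum_of_squares(c) for c in chunks)

    # Parallel style: the same chunks are farmed out across CPU cores.
    with ProcessPoolExecutor() as pool:
        parallel_total = sum(pool.map(sum_of_squares, chunks))

    print(parallel_total == sequential_total)  # both give the same answer
```

Note that the parallel version required restructuring the problem, not just a faster chip: this is the "thinking in parallel" that the new generation of programmers must learn.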
Multicore is a mainstream but (as yet) niche new technology.
In the next 10-15 years, there will be huge opportunities to translate legacy sequential ('traditional') code, and to create new software that takes full advantage of the thousands of cores in the next generation of chips.
Around the world, parallel computing is already used to process the vast quantities of data produced by the internet – the "big data" originating from social networks and from millions of intelligent data-recording devices attached to the internet.
Here in New Zealand it powers the biggest CGI rendering facility in the world, at Wellington's Weta Digital.
And soon it will be a key component of the information processing required to handle the data produced by the Square Kilometre Array radio telescope – a global scientific project that New Zealand is a part of.
In addition, there is a wide range of services, solutions and systems-integration challenges in connecting these two worlds – legacy sequential software and the new parallel platforms – together.