Big Data And Big Challenges At The Core Of Computer Science

Dr Hans Vandierendonck, High Performance and Distributed Computing, School of Electronics, Electrical Engineering and Computer Science

The expansion of teaching and research in Computer Science is a priority for Queen’s. It can be seen in the redevelopment of its home, the Bernard Crossland Building, where a modern structure is being created, and in the appointment of new academics like Dr Hans Vandierendonck, who is opening up research in a fast-developing field.

A degree in Computer Engineering from Ghent University in Belgium led to a PhD which focused on Computer Architecture, ‘the branch of computer science which studies how to design and organise the hardware components of computing systems.’

He spent eight years as a research fellow at the Research Foundation Flanders working on high performance computing. During this time he also travelled to Crete for a collaboration at FORTH – the Foundation for Research and Technology Hellas – with Professor Dimitrios Nikolopoulos who would later join the School of Electronics, Electrical Engineering and Computer Science at Queen’s.

‘That’s how I got to know about Queen’s and when an opening as a lecturer came up I applied for it.’

These are rapidly changing times in computer science, as Hans explains. ‘Computers became faster and faster but there were physical constraints on how much further that could go. So round about 2004 multi-core computers were introduced.

‘Now if you buy a laptop, or even a mobile phone, it doesn’t have one CPU – central processing unit – it has two or perhaps four. It’s like working in a restaurant where instead of one chef preparing a dish you have several working on it.

‘But software that’s been written for multi-core CPUs is much less reliable, more buggy and much harder to fix than the software written for a single CPU. So you have to solve the problem – how can I write software that uses multiple cores but doesn’t pose any significant problems for the programmer? That’s the programme I started to work on in my postdoc years and it’s what I’m working on today at Queen’s.’
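To make the restaurant analogy concrete, here is a minimal sketch – written for this article, not code from Hans’s research – of a program that splits one computation across however many cores the machine offers. The slicing arithmetic is exactly the kind of detail that, if wrong, still compiles and runs but quietly produces the wrong answer.

```cpp
// A deliberately simple parallel sum: several threads ("chefs") each
// prepare one slice of the work. Generic illustration only.
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<double> data(1'000'000, 1.0);

    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 2; // the runtime may not know; assume two

    std::vector<double> partial(cores, 0.0);
    std::vector<std::thread> workers;
    std::size_t chunk = data.size() / cores;

    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end = (c + 1 == cores) ? data.size() : begin + chunk;
        // Each thread writes only to its own slot in `partial`. If two
        // threads shared a slot, or the slices overlapped, the program
        // would still run - and silently compute the wrong total.
        workers.emplace_back([&, c, begin, end] {
            partial[c] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();

    std::cout << std::accumulate(partial.begin(), partial.end(), 0.0) << '\n';
}
```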

In addition, a research project funded through a Marie Curie Fellowship, with support from the international software company SAP in Belfast, looks at big data and how to manage it. It involves a new type of technology – non-volatile memory.

‘Vast amounts of data are being amassed, and they need vast amounts of energy. Google, Facebook, Twitter, BT – most companies have data centres using tens of megawatts each. In fact, it’s estimated that the data centres around the world are consuming more power than the whole of Italy.

‘Companies currently store these big datasets in the main memory of the computer. Traditional DRAM-based memories consume a lot of energy because they forget quickly: the data must be rewritten before it disappears. But this new technology uses very little energy. Reading and writing take energy, but once you store the data, it remains there, using almost none.’
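The programming consequence can be sketched in a few lines. The example below is a generic POSIX illustration, not the project’s actual code: non-volatile memory is typically exposed to programs as a memory-mapped region (the path /mnt/pmem/counter is hypothetical), so an ordinary store followed by a flush makes the data durable in place – no refresh, no separate copy to disk.

```cpp
// Generic POSIX sketch of the non-volatile memory programming model: data
// lives in memory-mapped storage, so a store plus a flush is durable in
// place. The path is hypothetical; an ordinary file stands in for a
// DAX-mapped persistent-memory device here.
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

int main() {
    int fd = open("/mnt/pmem/counter", O_RDWR | O_CREAT, 0600);
    if (fd < 0) { std::perror("open"); return 1; }
    if (ftruncate(fd, sizeof(long)) != 0) { std::perror("ftruncate"); return 1; }

    void* mem = mmap(nullptr, sizeof(long), PROT_READ | PROT_WRITE,
                     MAP_SHARED, fd, 0);
    if (mem == MAP_FAILED) { std::perror("mmap"); return 1; }
    long* counter = static_cast<long*>(mem);

    ++*counter;                            // an ordinary store ...
    msync(counter, sizeof(long), MS_SYNC); // ... made durable where it sits

    std::printf("counter = %ld\n", *counter);
    munmap(mem, sizeof(long));
    close(fd);
}
```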

Another project involving big data is funded under the EC’s Data Value Chain programme and is a collaboration with the National Technical University of Athens, FORTH, the University of Geneva and the Italian telecoms company Wind.

‘Wind are giving us a chunk of data from customer records to be analysed. One of the things they have in mind is shared car rides. They can identify people living in the same area and travelling to the same place, so they want to give them the opportunity to subscribe to a car-sharing service.
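A toy version of that matching idea – with invented field names and records, since the article does not describe the shape of Wind’s data – is simply a grouping of customers by their (origin, destination) pair:

```cpp
// Toy version of the car-share matching: group customers by their
// (home area, destination) pair. All names and records are invented.
#include <iostream>
#include <map>
#include <string>
#include <utility>
#include <vector>

struct Customer {
    std::string id;
    std::string home_area;   // where the customer lives
    std::string destination; // where they regularly travel to
};

int main() {
    std::vector<Customer> customers = {
        {"c1", "Rome-North", "Rome-Centre"},
        {"c2", "Rome-North", "Rome-Centre"},
        {"c3", "Milan-East", "Milan-Centre"},
    };

    std::map<std::pair<std::string, std::string>,
             std::vector<std::string>> groups;
    for (const auto& c : customers)
        groups[{c.home_area, c.destination}].push_back(c.id);

    // Any group with more than one member is a candidate shared ride.
    for (const auto& [route, ids] : groups) {
        if (ids.size() < 2) continue;
        std::cout << route.first << " -> " << route.second << ":";
        for (const auto& id : ids) std::cout << ' ' << id;
        std::cout << '\n';
    }
}
```

The research challenge, of course, lies in running this kind of analysis over millions of real customer records rather than three invented ones.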

‘We’re living in a digital world. The gigantic amount of data being shipped around the internet is growing at an exponential rate, doubling about every year. This is hard to sustain. There’s nothing in nature that grows at such a pace.’