The RPiCluster

2,552,655 views
Shared 2013-05-17
Documentation, Source code, and EagleCAD designs: bitbucket.org/jkiepert/rpicluster

Summary:
The RPiCluster is a 33-node Beowulf cluster built using Raspberry Pis (RPis). During my dissertation work at Boise State University I needed a cluster to run a distributed simulation I've been developing. The RPiCluster is the result. Each of the 33 RPis is overclocked to 1GHz and runs Arch Linux. This demo shows the RPiCluster running a parallel program I developed using MPI to control all of the RGB LEDs installed on each of the nodes.
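
For a flavor of how a demo like that is structured, here is a minimal sketch (my illustration, not the actual demo code, which is in the Bitbucket repo linked above): the root rank broadcasts a color each step and every node applies it locally. The printf stands in for the GPIO write that would actually drive each node's RGB LED. It would be built with mpicc and launched across the cluster's host list with mpiexec.

/* Minimal MPI LED-sync sketch (illustrative only; pin driving omitted). */
#include <mpi.h>
#include <stdio.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    int rank, size, step, color = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    for (step = 0; step < 30; step++) {
        if (rank == 0)
            color = step % 3;                    /* root cycles R, G, B */
        /* every node receives the same color for this frame */
        MPI_Bcast(&color, 1, MPI_INT, 0, MPI_COMM_WORLD);
        printf("node %d/%d: color %d\n", rank, size, color); /* stand-in for the GPIO write */
        MPI_Barrier(MPI_COMM_WORLD);             /* keep the LEDs visually in step */
        usleep(200000);                          /* 200 ms per frame */
    }

    MPI_Finalize();
    return 0;
}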

The Whole Story:
The RPiCluster project was started in Spring 2013 in response to a need that arose during my PhD dissertation research. My research is currently focused on developing a novel data sharing system for wireless sensor networks to facilitate in-network collaborative processing of sensor data. In the process of developing this system, it became clear that perhaps the most expedient way to test many of the ideas was to create a distributed simulation rather than developing directly on the final target embedded hardware. Thus, I began developing a distributed simulation in which each simulation node would behave like a wireless sensor node (with its inherent communication limitations) and, as such, interact with all the other simulation nodes within a LAN. This approach provided true asynchronous behavior and actual network communication between nodes, which enabled better emulation of real wireless sensor network behavior.
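
To make that concrete, the sketch below shows the general shape of one such simulation node, under my own assumptions (UDP broadcast on a made-up port 9000 and a fake one-line message format; the dissertation's actual protocol surely differs): each process periodically "senses" a value, broadcasts it to the LAN, and reacts to whatever it hears from its neighbors.

/* One hypothetical simulation node: broadcast a fake reading, listen for peers. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>

int main(void)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    int on = 1;
    setsockopt(sock, SOL_SOCKET, SO_BROADCAST, &on, sizeof(on));
    setsockopt(sock, SOL_SOCKET, SO_REUSEADDR, &on, sizeof(on));

    struct sockaddr_in local = {0}, bcast = {0};
    local.sin_family = AF_INET;
    local.sin_addr.s_addr = htonl(INADDR_ANY);
    local.sin_port = htons(9000);                       /* hypothetical shared port */
    bind(sock, (struct sockaddr *)&local, sizeof(local));

    bcast = local;
    bcast.sin_addr.s_addr = inet_addr("255.255.255.255"); /* LAN-wide broadcast */

    char out[64], in[64];
    for (;;) {
        /* "sense" something and share it with every node on the LAN */
        snprintf(out, sizeof(out), "reading=%d", rand() % 1024);
        sendto(sock, out, strlen(out), 0, (struct sockaddr *)&bcast, sizeof(bcast));

        /* block until a neighbor (or our own echo) is heard */
        ssize_t n = recvfrom(sock, in, sizeof(in) - 1, 0, NULL, NULL);
        if (n > 0) { in[n] = '\0'; printf("heard: %s\n", in); }
        sleep(1);                                       /* one sensing period */
    }
}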

So, why would I want to build a Beowulf cluster using Raspberry Pis? The Raspberry Pi has a relatively slow CPU by modern standards. It has limited RAM, slow USB-based 10/100 Ethernet, and its operating system runs directly on an SD card. None of these "features" are ideal for a cluster computer! Well, there are several reasons. First, when your dissertation work requires the use of a cluster, it is nice to ensure that one is available all the time. Second, RPis provide a unique feature in that they have external low-level hardware interfaces for embedded systems use, such as I2C, SPI, UART, and GPIO. This is very useful to electrical engineers (like myself) who need to test embedded hardware at scale. Third, having user-only access to a cluster (which is the case for most student-accessible systems) is fine if the cluster has all the necessary tools installed. If not, however, you must work with the cluster administrator to get things working. By building my own cluster, I could directly outfit it with anything I might need. Finally, RPis are cheap! The RPi platform has to be one of the cheapest ways to create a cluster of 32 nodes. The cost for an RPi with an 8GB SD card is ~$45. For comparison, each node in one of the clusters available to students here at BSU was about $1,250. So, for not much more than the price of one PC-based node, I could create a 32-node Raspberry Pi cluster!
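
As a small illustration of that low-level access (my example, not part of the cluster design; GPIO 18 is an arbitrary pin), a C program on a 2013-era RPi could blink a pin through the stock Linux sysfs GPIO interface, with no special libraries, run as root:

/* Blink one GPIO line via sysfs (requires root; pin choice is arbitrary). */
#include <stdio.h>
#include <unistd.h>

static void write_file(const char *path, const char *val)
{
    FILE *f = fopen(path, "w");
    if (f) { fputs(val, f); fclose(f); }
}

int main(void)
{
    int i;
    write_file("/sys/class/gpio/export", "18");            /* expose GPIO 18 */
    write_file("/sys/class/gpio/gpio18/direction", "out"); /* make it an output */
    for (i = 0; i < 10; i++) {                             /* blink 5 times */
        write_file("/sys/class/gpio/gpio18/value", i % 2 ? "0" : "1");
        usleep(500000);
    }
    write_file("/sys/class/gpio/unexport", "18");          /* release the pin */
    return 0;
}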

Update: The BeagleBone Black was not available when I started this project, but I would have chosen it over the Raspberry Pi had it been. It is the same cost once you include an SD card, yet it has 2GB of onboard flash storage for the operating system and a Cortex-A8 ARM processor running at 1GHz.

Cluster Performance:
I measured basic computing performance in a number of ways (see the paper). MPI performance was measured using HPL (www.netlib.org/benchmark/hpl/) with ATLAS (www.netlib.org/atlas/). The RPiCluster achieved 10+ GFLOPS peak with 32 nodes running HPL. The single 3.1GHz Xeon E3-1225 (quad-core) system I used for comparison showed about 40 GFLOPS peak (when the HPL problem size was optimized for the Xeon system).

When I run the HPL problem that achieves 10 GFLOPS on the RPiCluster, the Xeon system achieves about 2 GFLOPS. This is because the HPL problem size is so large that it causes paging on the Xeon system. The Xeon system has 8GB of RAM (~6GB usable after OS, etc) whereas the RPiCluster has about 16GB of RAM (~15GB usable after OS, etc).
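
A quick back-of-the-envelope (my numbers, not from the paper) shows why: HPL factors a dense N x N matrix of doubles, so it needs roughly 8*N^2 bytes. Sizing the problem to the RPiCluster's ~15GB of usable RAM gives

    N ≈ sqrt(15e9 / 8) ≈ 43,000

while the Xeon's ~6GB of usable RAM only accommodates

    N ≈ sqrt(6e9 / 8) ≈ 27,000

so a cluster-sized run spills out of the Xeon's physical memory, and paging drags it down to about 2 GFLOPS.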

More information: coen.boisestate.edu/ece/raspberry-pi/

Update:
I finished my PhD Spring 2014. For those interested in further details on what I was doing, an electronic copy of my dissertation is available here: scholarworks.boisestate.edu/td/807

Comments (21)
  • Now I know why I was having a hard time finding a Raspberry Pi in stock.
  • @walter0bz
    I accidentally clicked 2 rpi2 purchases; maybe this guy made a similar mistake but on a bigger scale
  • Watching this exactly 10 years after the video published. Still a badass setup to look at.
  • @cpace123
    Why are there so many people who just have negative things to say? Every hobby does not have to make money or even make sense. There is the experience, the learning, and just the fun. I mean, it's like people who put extra lights on their cars. It's not my thing, and I would not spend my money there, but if they are having fun, whose place is it to tell them it's wrong? Well, as long as it's not hurting anyone. So come on, people. Be kind. Or at least constructive.
  • Turn this into a RuneScape bot farm and make the Venezuelan economy crumble even more
  • For anyone out there confused about what this cluster of computers is doing: he's basically simulating a real-world problem where you have several computers/devices wirelessly communicating data to the same database at the same time, but in different instances. This explanation doesn't do it justice, but that's it in layman's terms. Read the description for the real one ;)
  • Now I don't need to imagine a Beowulf cluster made of Raspberry Pis anymore. Thanks.
  • @tbbw
    I really like the 5V feed you use. It saves a lot of extra wires and generally makes your setup look really clean.
  • The controller for the lighting is more powerful than the cluster itself.
  • @kd8gby
    As an Electrical/Computer Engineer... I can say that this is one impressive feat! Well done sir! I'd love to see some benchmarks on this little bugger.
  • I like that you do both send/receive as well as broadcast messages, clever idea!
  • I love the power solution! I was trying to imagine all those micro USB power connectors, but applying power to the header and adding your own safety fuse (and programmable LED indicator) was completely clever. What became of the cluster after your research? Does anyone still try experimental loads on it? Are there practical, economic workloads for it?
  • @Disthron
    It looks really cool but I'm wondering what you are simulating on it.
  • This project is done very professionally. This Josh Kiepert dude is a shoo-in for almost any HPC job out there - he probably already has one.
  • Amazing job! I wish my dissertation 10 years ago involved something as neat as this! 
  • @Aaronage1
    Very cool, hope you got top marks on your dissertation! I'd love to see a build like this with the ODROID U3. The U3 is a new $59 board with a 1.7GHz Exynos 441x Cortex-A9 and 2GB of RAM. Wish I had the money to spare, would be a fun project :)
  • Upgrade those to Pi 3's!!! Then you'd be killin it!
  • This is very impressive, I wish I had the know-how to do this. It's exciting to think of the possibilities of all this.
  • That was so cool. I'd love to read through your programming.