Tuesday, December 7, 2010

Listen to what Microsoft has to say about using colors in user interfaces

I just stumbled upon a very interesting and important article on MSDN about colors and their use in user interfaces.

After the fundamentals of different color spaces are explained, the article continues with design concepts and guidelines that are important to every software developer who is not exclusively programming for monochrome displays. It highlights the fact that a large portion of the male population has difficulties distinguishing colors and links to the publication Can Color-Blind Users See Your Site?, which covers this topic in more depth. But it is not only color-blind people who get annoyed by software that uses colors that are hard to distinguish.

Another highlighted aspect is themes and the possibility for users to change their software's visuals to their liking. Those wishes should be respected by every software developer and designer.

The primary goal of a user interface is to help users perceive what is going on and how they can accomplish what they currently want to do with your software. Icons and colors are an important part of making this as easy as possible, but there are many things you can get wrong when styling your application. You have to take factors like different display hardware, lighting conditions, people's individual color perception and custom themes into account.

Wednesday, August 18, 2010

What include guards in C++ are, and what they are not

And why they do not solve "multiple definition" errors.

Include guards are heavily used in C++ and you see them in virtually any code base (that has more than two files). Sometimes there is a bit of confusion about what they do and what they cannot do. I decided to explain everything in detail, which means covering the whole C++ compilation process of preprocessing, compiling and linking your application or library.

Introduction to include guards

First, it's important to know that include guards are not a language feature. They are a technique that uses standard preprocessor features to solve a common issue: they prevent a header from being included multiple times. Let's look at a simple example of two files, test.cpp and the corresponding test.hpp:

test.hpp:
class Test
{
   public:
      void foo();
};
test.cpp:

#include "test.hpp"
/* Some more includes */
#include "test.hpp" // Error! class Test is already defined

void Test::foo()
{
   // Do something here
}

int main()
{
   Test test;
   test.foo();
}

As you can see, test.hpp is included twice in this example. Including the same header twice directly in one implementation file might not be common, but when the chain of includes gets long (headers including other headers), it can very well happen. Remember that an #include is basically the same as copy & pasting the content of the included file at that very position - only that the preprocessor does it for you. The resulting code when you run test.cpp through the preprocessor will look like this:

test.cpp - preprocessed

class Test
{
   public:
      void foo();
};

class Test
{
   public:
      void foo();
};

void Test::foo()
{
   // Do something here
}

int main()
{
   Test test;
   test.foo();
}

It should be obvious now why the error occurs: you can't define two classes with the same name in the same translation unit. And that's where include guards come to the rescue. The basic idea behind include guards is: do something in the header that keeps the preprocessor from "copy & pasting" the content into the including implementation file a second time. Besides getting rid of duplicate definitions, this even speeds up compilation, since the compiler would otherwise have to process the same content twice. The goal is achieved by checking whether a certain preprocessor macro is defined. If it is not, the macro is defined, so that a second check will notice that it is already there, which means the header has already been included. Here's how it is done:

#ifndef TEST_H
#define TEST_H

class Test
{
   public:
      void foo();
};

#endif

This is what happens, from the viewpoint of the preprocessor:

  • 1st Include of test.hpp
    • Is TEST_H defined? - no
    • Define TEST_H
    • Include content of test.hpp
  • 2nd Include of test.hpp
    • Is TEST_H defined? - yes
    • Skip to the #endif directive
    • Include everything in test.hpp after the #endif

      (In 99.9% of cases, this should be empty)

This leads to the desired result for the preprocessed test.cpp, even when we include test.hpp twice:

test.cpp - preprocessed with include guards
class Test
{
   public:
      void foo();
};

// Here should be the second include, which has been avoided by the include guard

void Test::foo()
{
   // Do something here
}

int main()
{
   Test test;
   test.foo();
}

Compiler-specific directives: #pragma once

Since include guards are so common, some compiler vendors (read: Microsoft) decided to create a special directive for them. You might have come across the directive #pragma once while reading through code. #pragma once is all three preprocessor directives in one: it replaces the #ifndef, the #define and the #endif (where the #endif always sits at the very end of the file). A sketch of a header using it follows after the list below. As this is a vendor-specific directive, not all compilers support it. One reason for this may be that it does not solve a problem that couldn't be solved otherwise. This is why you should abide by the following simple rule:

Don't use #pragma once

Because:

  • It's compiler-specific and is not portable
  • Because of this, not everyone reading your code might know it
  • It does not offer big advantages over the "classic" include guards.
(I said "big" advantages. As noted in the Wikipedia article on #pragma once, Visual C++ includes optimization code that skips headers using #pragma once faster than classic include guards. While this might be desirable, I can't imagine that it will be of great benefit in practice. And, after all, GCC includes an optimization for classic include guards, too, which might render the speed improvements of #pragma once on Visual Studio minimal.)

The compile process: Preprocess, compile, link

To fully understand the issue at hand, you have to know how the whole compilation procedure for C++ works. It is divided into three steps.

The first step is the preprocessor. Everything that starts with a # sign is a preprocessor directive. The most common ones are #define, #ifdef or #ifndef, #endif and so on. The preprocessor is essential to C++ because it's the only pragmatic way to split up declarations and definitions and make the same function or class usable from multiple implementation files. The preprocessor is run on each implementation file (to make this clear again: these are the files usually named .cpp). The implementation files include header files (and should never include other implementation files). All preprocessor directives in the header files are processed, too, which means that you can #include other files in a header, #define macros and create include guards there. The result of preprocessing an implementation file is one large file that contains every included header, and of course the headers included by those headers, and so on. For files that use libraries such as the standard library, Boost or Qt, the result will often be huge.

The second step is the compilation. This huge mess - also called the translation unit - is now run through the compiler. Yes, the compiler only compiles one file, and it is not the .cpp file you passed on the command line or added to the project. It is a file generated by the preprocessor that does not even remotely resemble your original .cpp file (except for the very bottom part). Most notably, you will not find a single preprocessor directive (like #define) in this resulting file, since they have all already been handled by the preprocessor. The result of the compilation is the object file. The object file is not human-readable and no longer contains source code. Instead, it contains so-called symbols and their content, which might be variables, constants, functions and virtual function tables. This is why you usually have one object file per implementation file in your compiler's working directory (the Release and Debug directories for Visual Studio; GCC puts the object files right next to your .cpp files if you do not specify an output file explicitly). You don't have object files for headers, because they are not compiled individually. They are only "glued" on top of your implementation files. At this point it is important to know that when the compiler generates a function call, it does not put the absolute address of the function into the generated code. Instead, it references the symbol of that function. In C, this is simply the function's name. In C++, it's the function's name mangled with some meta-information that depends on the compiler. It's very similar with global variables and virtual function tables: they are not referenced by an absolute address, either, but via their symbol name. This means that the object files created by the compiler do not contain machine code that is ready to be executed - the symbols still have to be replaced with their actual addresses. And this is where the linker jumps in.

The third and last step is the linking. Just like the compiler, the linker does not know about header files. It does not know about implementation files, either. It only knows object files, and its job is to link all object files together into one executable or library that can finally be executed by the machine. It does this by pasting all object files into one and, whenever a symbol occurs (in a virtual or static function call or when referencing a global variable, for example), replacing that symbol with its address. That is why you sometimes get errors like "Symbol xyz is already defined in foo.obj". It means that you defined a function or variable in two different translation units, which then generate the same symbol. This is not allowed per the One Definition Rule, because the linker would not know which of them it should use.
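
To make the symbol resolution a bit more tangible, here is a minimal sketch with two translation units (the file names a.cpp and b.cpp are made up for this illustration):

a.cpp:

// Defines foo(). The compiler emits foo()'s symbol together with its
// machine code into a.o / a.obj.
void foo()
{
   // Do something here
}

b.cpp:

// Only declares foo(). The call below is compiled into a reference to
// foo()'s symbol; the linker later resolves that reference to the
// definition found in a.o / a.obj.
void foo();

int main()
{
   foo();
}

If a.cpp were missing from the project, the linker would complain about an undefined reference to foo() - the counterpart of the "already defined" error.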

Multiple definitions

There are two ways you can cause multiple definitions of the same symbol: you accidentally define the same global variable or function in two distinct places (e.g. once in the header and a second time in the implementation file), or you include a header that defines a global variable or function from two separate translation units. The first issue is caught by the compiler. Since both definitions end up in the same translation unit, the compiler notices that the identifier in question is already defined and bails out. Here is an example of a double definition of a function foo():

test.hpp
void foo() { } // Note the empty function body!

test.cpp

void foo()
{
   /* Do something */
}

This can happen, for example, when moving a function's definition into the header and forgetting to remove the original definition from the implementation file.

The second issue is a bit trickier. Remember that a translation unit is the implementation file with all included headers prepended. So we take the same header as before, which defines the function void foo() { } right in the header (note that it is not marked inline). But this time, it is included from two files in the project, one named test.cpp and the other named test2.cpp:

test.hpp
void foo() { }

test.cpp
#include "test.hpp"

// The actual content of the program is irrelevant
int main ()
{
}

test2.cpp
#include "test.hpp"

// Here are some functions defined in test2.cpp

When you compile this project, you will get an error message along the lines of

multiple definition of `foo()'

This is because both translation units define a symbol foo, which is prohibited by the holy One Definition Rule.

Include guards to the... rescue?

So you might think: "If the function may not be defined twice, let's wrap test.hpp in an include guard!" Try it and see for yourself - it will not make a difference. This is because the preprocessor is run for each implementation file separately, leading to two distinct translation units. When test.cpp is preprocessed, TEST_H is not defined yet, so the header's content is included in the resulting translation unit. The resulting code is then compiled into test.o or test.obj without any error - there is only one declaration and definition of the function in this translation unit, after all. When the second implementation file is preprocessed, the preprocessor starts over, which means that #defines made while preprocessing the first translation unit are not available in the second or any other preprocessing run. So TEST_H is still not defined for test2.cpp, and foo()'s definition is included a second time and compiled into test2.o or test2.obj. As a result, foo() is defined in both test.obj and test2.obj.

The final step is the linking, which should produce an executable or machine-readable library. But the linker notices that foo() is defined twice, prints the aforementioned error message and aborts its task. Because the linker does not even look at the source code that the object files were compiled from, it does not know where the double definitions occurred. In fact, the linker works even without the source code being available.

This means that include guards do not solve multiple definition errors.
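
What does solve the problem, in case you are wondering: keep only the declaration in the header and move the definition into exactly one implementation file, or mark the function inline so that identical definitions in several translation units are allowed. A minimal sketch of the first variant:

test.hpp:

#ifndef TEST_H
#define TEST_H

void foo(); // declaration only - safe to include from any number of files

// Alternative: keep the definition in the header, but mark it inline:
// inline void foo() { }

#endif

test.cpp:

#include "test.hpp"

void foo() // the one and only definition, in exactly one translation unit
{
   /* Do something */
}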

Friday, April 23, 2010

Samsung N150 Eliah Netbook review

I recently bought a white Samsung N150 netbook because I need it for my upcoming vacation in Japan. There are two target audiences for netbooks: customers with a limited budget and customers who appreciate the small size and high portability of such devices. In my case, both were true. Since I've spent most of my money on my trip to Japan, I was not able to buy a laptop in the 600-700€ range. Additionally, I wanted something small that would easily fit in my backpack, including a protective case, and that does not weigh too much. With its 10" display, a total size of 180x264mm (10.4"x7.4") and a weight (including battery) of 1240g (2.73 lbs), it fully meets my mobility expectations. You don't even notice the weight when carrying it in a backpack or bag.

Specs

Here are some additional specs: like almost every netbook, the N150 features an Intel Atom at 1.6 GHz. It already uses the newer generation that was introduced at the beginning of this year. The screen's resolution is 1024x600 pixels, driven by an Intel GMA 3150 chipset. The display can be scaled to emulate 1024x768 for applications that need that resolution, but it obviously looks ugly then. There is an external VGA plug that can output up to 2048x1536 pixels. Neat! I already connected my HDTV and it works perfectly at the Full HD resolution of 1920x1080. The N150 ships with Windows 7 Starter and a few drivers, utilities and games pre-installed. On the first boot, the installation is automatically finalized, which took something like two hours, d'oh. In the setup process, you can choose the sizes of the two partitions that will be created. You have a total of 250GB of hard disk space available for your needs. I chose an 80GB system partition and left the rest for data storage. Because Windows 7 Starter only supports 1 GB of memory - you have to upgrade to Windows 7 Home Premium for around 80€ if you want more - the netbook has only 1 GB of DDR2-800 installed. The memory can easily be upgraded, because there is a special removable cover to access the module. Another limitation of Windows 7 Starter is that you can't use the external monitor as an extended desktop; only cloning the image is allowed. The problem of not being able to change your desktop wallpaper on Windows 7 Starter can be circumvented using this trick.

Connectors

It comes with a total of 3 USB ports, two on the right and one on the left. The one on the left is the only one that can be used to power devices that draw their power from the USB port, and it is labeled with a small "power" symbol. You can enable and disable the power output on this port with a special utility that comes pre-installed. Microphone and external speakers or headphones can be connected via two separate plugs. The aforementioned VGA connector is on the right side, while the Ethernet cable plugs in on the left side, next to the round, relatively small power connector. Only 10 and 100 Mbit Ethernet are supported - so no Gigabit Ethernet on the netbook :-( The AC adapter works with 100 to 240 volts, making it perfectly suitable for my stay in Japan. Quite hidden beneath the touchpad is an SD card slot with the typical, somewhat fiddly cover. It supports the SD, SDHC and MMC formats.

Internal peripherals

That's it for the connectors for external peripherals. There are also some nice hardware pieces built in. One of them is the mandatory 802.11 b/g/n WLAN, but Bluetooth is included as well, which can be quite handy for transferring pictures from your cell phone when you don't have a suitable (and usually expensive) cable at hand. While you can plug in an external microphone, there is already one built in. It's located next to the touchpad, which was kind of a stupid decision because you easily cover it with your hand when using the touchpad. I used it to Skype with friends and they said that the quality was very good - which surprised me, because it is so tiny. The webcam has a resolution of 320x200 pixels and is very slow. Doing video-chat with it is not much fun, because you hardly see your movements. Software from Cyberlink is included that can be used to play with the recorded picture, but it's actually not that great, either. What really bothers me is that there is no hardware switch or other means to disable or cover the webcam. If you get malware installed that can remote-control your webcam, you will never notice. There is no indicator of whether the webcam is active, either.

Keyboard and touchpad

Being very small, the keyboard obviously does not have a standard layout. The keys Insert, Delete, Enter, Page Up, Page Down and the arrow keys are directly accessible. Home and End, on the other hand, are only accessible by pressing the Fn key that's located between the left Ctrl and Windows keys. This is actually the only thing that takes some time to get used to. The rest of the layout is nice and I could touch-type on the keyboard right away. Too much typing leads to numb fingers (at least for me), because the keys are pretty stiff. The numpad keys overlay the regular character keys and can be activated by pressing Num Lock (only available via the Fn key). When you press Fn together with the designated numpad keys, they behave like the deactivated numpad keys. The F-keys have special functions that can be used by pressing the Fn key:

  • Escape: Sleep
  • F2: Battery status
  • F3: Euro sign (€)
  • F4: Switch monitor modes (when an external monitor is attached)
  • F5: Switch backlight on/off
  • F6: Mute sound
  • F7: Samsung Support Center
  • F8:
  • F9: Disable/Enable WLAN
  • F10: Disable/Enable Touchpad
  • F11: Numlock on/off
  • F12: Scroll lock on/off
  • Insert: Pause
  • Directional up/down: Brightness
  • Directional left/right: Sound volume
All in all, the keyboard is very usable, despite the Home and End key placement and the misplaced <>| key.

The multi-touch touchpad has two buttons - no middle mouse button! Scrolling is done with a two-finger gesture, which sometimes just doesn't want to work, especially when you are not sitting directly in front of the keyboard. But most of the time scrolling works very nicely. There's also a "three-finger flick" gesture that's supposed to switch between tabs in browsers, but it doesn't work for me (using Google Chrome as my browser on the netbook). The touchpad's sensitivity can be configured, and after adjusting it to my needs, it works flawlessly (well, most of the time, at least). I bought an external mouse that matches the colors of the netbook, but I rarely use it - only when gaming. My overall impression of the touchpad, keeping its size constraints in mind, is very good. And that's although this is my first laptop and I'm not used to touchpads at all.

Software

I removed all pre-installed games before trying them, so I can't tell you whether they are fun or not. A 60-day trial of Microsoft Office 2007 and a full version of Microsoft Works are shipped with the N150, too. The Samsung Recovery Solution is pretty neat: you can easily back up your system partition to your data partition and restore it with only a few mouse clicks. I think the Samsung Recovery Solution is a re-branded Acronis product. Since I own Acronis True Image Home, I've created a bootable USB stick with Acronis that I use instead of the Samsung Recovery Solution, though. There are other Samsung tools for extending battery life, enabling/disabling the chargeable USB port, and managing display, resolution and network settings, plus a tool to update all the Samsung tools. As with all other Windows 7 versions, Windows Live comes for free, which is not bad at all. For example, you can use Windows Live Movie Maker to convert movies to a lower resolution when playback is choppy (which is the case with 720p material).

Working with the N150

When I bought this netbook, I thought that I'd have to live with a number of limitations due to the low-end hardware. I was wrong. The 1.6 GHz Atom with 1 GB of RAM actually performs much better than I had expected. I never ran into memory shortages, even when browsing with a number of open tabs, some of them containing a Flash stream. Skype works fast and smoothly, too, although it warns me that my hardware is too slow. Video playback works great for standard definition videos. 720p videos are a little bit choppy - it feels like just a few Hertz are missing to play them without stuttering. :-( Using VLC I get much better results than with Windows Media Player or Media Player Classic. I noticed that Flash video streams are sometimes very choppy, though. I haven't figured out why yet.

Even when under load, the netbook does not get very hot. It's more like a convenient hand warmer than something that would bother you. The fan and the harddisk are quiet, too, although the constant sound of the harddisk seeks can get on your nerves. You don't hear anything at all when there is some background noise (like a TV or something).

The display's low resolution is also less of a problem than I initially thought. While there is quite a bit of vertical scrolling, it's not that bad thanks to the two-finger scrolling gesture (which works vertically and horizontally, btw). Some applications, however, expect a minimum of 1024x768. That's why you can switch to this resolution and scale it down to 1024x600 (which looks ugly, of course, but may be necessary to reach some buttons).

But the CPU's lack of performance shows in heavy computational tasks such as video transcoding. As I said, 720p videos do not play without stuttering, so I tried to recode one to 480p using Windows Live Movie Maker (since that's what I had at hand). On my desktop system (a Core i7 860), it took something like 7 minutes, while the netbook worked on it for an hour and 15 minutes. Admittedly, it's not very fair to compare the N150 to a system whose CPU alone costs as much as the complete netbook.

Gaming

I would never have thought that a netbook could be such a neat gaming device. Of course, only older games or games with 2D graphics run on the N150. But since there are awesome classic games, you can have quite a lot of fun gaming on this netbook. For instance, you can run Quake III in 1024x600 at somewhere between 30 and 80 fps. Yay! You can play Diablo II on it, too, but it becomes very choppy, sometimes dropping to one frame every few seconds, when there are many enemies on screen. Go check abandonia.com for free old games, such as Nightmare Creatures or Blood. There's also gog.com, which sells classic games at relatively low prices. Modern "mini-games" like Plants vs. Zombies (you can get it on Steam for half the price) or the extremely famous FarmVille (on Facebook) are perfectly playable, too. I even play Beat Hazard (on Steam) using a wireless Xbox 360 gamepad. But of course the device has its limits. I tried Torchlight, a modern Diablo clone from the original Diablo programmers, but the game ran very choppily and was not playable, although it has a "Netbook mode". You probably need an Ion netbook to run it smoothly. Basically every game released before 2000 should be OK. For later games, I'd try the demo before buying. I ran the oldest available 3DMark, 3DMark03, and scored 670 points. Doesn't sound amazing, but seems to be enough :-)

Conclusion

I am very happy with this device. The price tag is friendly to your wallet, the size is portable and the build quality is great. There are a few drawbacks (such as not being able to turn off the camera and the missing middle mouse button), but if you can live with them, there's nothing to complain about. I have had a lot of fun with it so far, and I'm sure it'll be a good companion on my trip to Japan. Great buy!

Thursday, April 15, 2010

C++ Trivia: Another reason to avoid function-style casts

I was asked to solve a problem that two of my colleagues were puzzled by. The source code was short and minimalistic, but the error was not very obvious. Here's the code (modified, because the original included dependencies on Qt):

#include <iostream>
#include <string>

class Test
{
public:
   enum Enum
   {
      First,
      Second,
      Last
   };

   Test(Enum e): m_value(e) { }

   std::string name() const
   {
      switch (m_value)
      {
      case First:
         return "First";
      case Second:
         return "Second";
      case Last:
         return "Last";
      }
      return std::string();
   }

private:
   Enum m_value;
};

int main()
{
   for (int i = 0; i < Test::Last; ++i)
   {
      Test test(Test::Enum(i));
      std::cout << test.name() << std::endl;
   }
}

Can you spot the error without compiling? Don't worry, even compilers do not agree on this one. Visual C++ 2005 and gcc 4.4.1 both spit out a similar error message for the line

std::cout << test.name() << std::endl;

gcc, by the way, has the more helpful error message here:
test.cpp:39: error: request for member ‘name’ in ‘test’, which is of non-class type ‘Test(Test::Enum)’
Does this help you? Have a close look at the code in the for-loop. Only look at the solution when you are out of ideas!


The statement
Test test(Test::Enum(i));
is a function declaration. There is a rule in C++ that everything that can be read as a function declaration will be treated as a function declaration (this is commonly known as the "most vexing parse"). In this particular case, it declares a function returning an object of type Test and taking a parameter of type Test::Enum named i. It's important to know that you can always put parameter names in parentheses. This means that
void test(int i);
is the same as
void test(int (i));
The failed attempt to cast i to a value of type Test::Enum leads to a construct that looks like a function parameter. So test is a function name, which obviously can't be used in the manner of test.name(). Hint: this wouldn't have happened if static_cast had been used.
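
To illustrate the hint, here is a sketch of how the loop body could be written so that it really defines an object - either an extra pair of parentheses around the initializer or a static_cast removes the ambiguity:

// Alternative 1: extra parentheses - this can no longer be parsed as a declaration
Test test((Test::Enum(i)));
// Alternative 2: static_cast, as hinted above
Test test2(static_cast<Test::Enum>(i));
std::cout << test.name() << std::endl;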

Saturday, April 10, 2010

Change wallpaper background image on Windows 7 Starter for netbooks

This came as quite a surprise. I knew that Windows 7 Starter might have some limitations. Luckily, the limit on the number of concurrently running applications has been dropped. But the one really annoying thing it is missing is the ability to change your wallpaper. This is really a bummer, and I can imagine that quite a few people will pay the roughly $80 for an Anytime Upgrade to Windows 7 Home Premium just for this reason. But I don't want to throw my money at Microsoft for such a silly reason, so I investigated whether it is possible to change the wallpaper through some registry hacks or similar. I found out that many people simply replaced the original image file and it worked for them - but they were using older (pre-release) versions of Windows 7 Starter, and this has been "fixed" in the final release. One option is to use tools like WindowBlinds by Stardock. However, this tool replaces the complete look and feel of your Windows installation. Since I like the original look and feel and don't want to replace a large part of my system with 3rd party software, I looked into other possible solutions. The best tool that I could find is available at a French site and is called Starter Background Changer. This tool replaces the "Personalize" screen of Windows with a home-made interface (with a few funny translations, at least in German) that lets you change your desktop's background image and some other settings. Here's a screenshot of the interface in German:

Starter Background Changer screenshot

Thursday, February 4, 2010

Hardware price/performance guides for processors and graphics cards

What started with the graphics card manufacturers' mangled naming schemes has long since been adopted by the CPU manufacturers. The days when you knew that a "GeForce 3" was faster than a "GeForce 2" and a Pentium at 500MHz was faster than a Pentium at 400MHz are gone. CPU clock speeds and core counts no longer map to performance in a linear fashion, so price comparisons have become very hard in the last few years. Luckily, a few tools are available to us consumers that make this a bit easier.

CPUs

First of all, there's a very neat CPU price/performance comparison list at pulsiageek's site. It's especially useful because you can sort it by price, performance or price/performance-ratio.

Graphics cards

Then there's a list of graphics cards, ranging from the old 2MB 3dfx Voodoo up to the latest GeForce GTX 295. The list over at gpureview.com unfortunately does not contain benchmark results, but it does list the MSRP.

Mobile graphics

Mobile GPUs are a whole different story, so it's good to know that there is a separate list at notebookcheck.net containing the 3DMark01, 3DMark03, 3DMark05 and 3DMark06 scores, which are probably the easiest indicators of performance. While there are numerous filters you can apply to the list to narrow down specific graphics cards, no MSRP or reseller price is mentioned. There's also a list available that includes actual FPS rates in popular games for each mobile GPU.

Tuesday, February 2, 2010

Scroll to the last highlight in irssi

Perhaps you know this situation: you have irssi running in screen and have been away for a few days. You come back and see that you were highlighted a few hours or maybe even days ago. Irssi usually shows you the highlighted line in the server window, and it may be something like "daniel: What? Nooo!". Now you're wondering "What the heck was he referring to?", since you don't remember the conversation from that time. So you start scrolling back, probably hundreds of lines. When you are connected to your irssi via ssh over a slow line, this may take a second or more per page. It often happens to me that it takes 3 minutes or more to scroll back, just to find out that he mistyped another nick.

I finally learned about a way to scroll directly to the highlighted position. With the command

/scrollback goto [dd-mm] HH:MM
you can scroll directly to a point in time. And since the timestamp is usually displayed next to the highlight in the server window, it's possible to jump straight to the highlighted position. You can then use
/scrollback end
to go back again to the most recent message in the window.
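
For example, if the highlight in the server window shows a timestamp of 14:23 on December 5th (the date and time here are made up), jumping there and back would look like this:

/scrollback goto 05-12 14:23
/scrollback end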

I had been annoyed by that tedious scrolling task for years and asked about it in #irssi every now and then, but until today nobody could give me a solution.

Thanks to jink from #irssi on freenode for telling me about /scrollback goto!

Wednesday, January 6, 2010

Caviar Green, WD10EADS: Green is not my color

I recently bought a completely new PC, because my water cooling leaked, I was fed up with it and wanted to start from scratch. The new system has top-notch components that deliver extremely fast performance in games, video encoding, etc. I "accidentally" bought two 1TB Western Digital "Caviar Green" drives (product code WD10EADS).

These disks are more than mysterious. Nobody seems to know at what rpm they spin. Some resellers list them with 7,200 rpm, others with 5,400 rpm. There are even specifications stating "5,400-7,200 rpm", which could be an indicator of why they are "Green": they spin down when not heavily used. Seems like a nice technology - get all the performance when needed, keep noise and power consumption low when not. It's the first time I've heard about this and it's still only an assumption. Western Digital states 7,200 rpm for their Blue and Black series, but the specs are simply missing for the Green series.

My actual experience with these disks is, frankly put, horrible. I created a RAID1 of the two disks and made one 100GB partition for my Windows 7 installation, using the rest for game installations, movies and all the other stuff. The Windows installation was quite smooth (if that's even possible), but system startup and program startup times were disastrous. When the system booted with nothing but Firefox installed, while it was still doing its usual post-login work, starting Firefox took around 3 minutes. This, of course, does not apply once system startup has completed and the disk is idle. My impression is that overall data throughput is quite OK with these disks, but seeking and concurrent accesses totally kill performance. I'm sorry I didn't run any benchmarks with actual values I could present, but my subjective impression was more than enough for me to decide to return these disks. I think I'll choose the Western Digital Black series instead (which has double the power consumption, according to the manufacturer).

Conclusion

Don't get me wrong - I don't think these disks are bad overall. They were just put to use in the wrong place. I can see good uses for them in external USB enclosures, as a second data drive in an office computer, or as an internal or external backup medium - but not as a system drive or a drive that is accessed frequently and concurrently. What I am most angry about is Western Digital's information policy. Why can't they clearly state how the disks work and at what rpm they spin under which conditions? Holding back information about your products is nothing I can accept.