Now that USB flash drives are cheap and common, producers clearly need to get creative to compete. Here are a couple of great ideas for storage with a more personal touch:
Stuff this cuddly USB bear with 1GB of photos, music or documents.
I have to admit this is the first blue fortune cookie I’ve ever seen, but imagine how many fortunes you could fit in this USB fortune cookie.
Also from the freshly baked line, the USB Hamburger may not be a half-pound burger, but it is a 2GB drive!
Of course, none of these beat the USB Humping Dog, but you can’t store files on that (and let’s face it, it might not quite be appropriate for that executive board meeting).
Thanks to Don for the USB bear.
It looks like Oracle has decided to adopt Apple’s Xserve RAID as a low-cost storage solution.
Based on our own experience with Apple technology, Xserve RAID is a great match for applications running Oracle.
With an appetite of one to two petabytes annually, Oracle is of course looking to control costs, and the SATA-based Xserve RAID combines value, capacity, performance and availability. The current top configuration offers 7000GB on dual RAID controllers with redundant power and cooling for a mere $12,999. Sure, that’s the price of a small car, but it works out to only $1.86/GB for some serious industry-level storage.
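As a quick sanity check on that per-gigabyte figure, here is a one-liner (a sketch using awk, not from the original post) dividing the list price by the capacity:

```shell
# $12,999 for 7000GB works out to roughly $1.86 per gigabyte
awk 'BEGIN { printf "$%.2f/GB\n", 12999 / 7000 }'
```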
Thanks to Zach for sending this on to me.
apple, oracle, xserve, raid, xserve raid, storage, disk, hard drive
Need to know how much space a directory and its contents are taking up on your UNIX system? Here’s what I use:
du -ks directory
The du command is used to summarize disk usage. Without any flags, it will show the usage in blocks for every directory and subdirectory specified. Since the size of a block varies by operating system, we add the -k option to specify that we want the output in kilobytes. On many operating systems you could also use -h for “human-readable” output with abbreviations like B for bytes, K for kilobytes, M for megabytes and so on.

The -s option lets us gather only the sum for the directory specified. Without the -s flag we would get output for every subdirectory as well as the specified directory.
$ du -k stuff
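To see the difference the -s flag makes, here is a quick sketch on a made-up stuff directory (the directory names and file are invented for illustration):

```shell
# Build a small sample tree to demonstrate (names are hypothetical)
mkdir -p stuff/photos stuff/music
dd if=/dev/zero of=stuff/photos/pic.jpg bs=1024 count=100 2>/dev/null

# Without -s: one line per subdirectory, plus a line for stuff itself
du -k stuff

# With -s: a single summary line for stuff
du -ks stuff
```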
One other thing that is useful for finding the biggest files and directories, when there are a lot to sift through, is to use a wildcard to size up multiple directories, then pipe the output of du to the sort command like this:
$ du -ks ./* | sort -n
With sort, we use the -n option to order things by arithmetic value rather than alphabetically (making 8 come before 304) so we see the largest items at the bottom.
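Putting the two commands together on a sample tree (directory names invented for illustration), the pipeline sorts the summaries so the biggest directory lands on the last line; adding -r to sort reverses that if you prefer the biggest at the top:

```shell
# Build two directories of obviously different sizes (hypothetical names)
mkdir -p demo/big demo/small
dd if=/dev/zero of=demo/big/file bs=1024 count=500 2>/dev/null
dd if=/dev/zero of=demo/small/file bs=1024 count=10 2>/dev/null
cd demo

# Summarize each top-level directory, smallest first
du -ks ./* | sort -n

# Reverse the numeric sort to put the biggest first
du -ks ./* | sort -rn
```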
Try it out. As always, check the man pages for more info.
For more tips like this, check out my book Easy Linux Commands, only $19.95 from Rampant TechPress.
unix, solaris, linux, sysadmin, system administration, storage, storage administration