Setting up a Small Server Room
Ted
ted.leslie-Re5JQEeQqe8AvxtiuMwx3w at public.gmane.org
Mon Nov 21 18:44:07 UTC 2011
If your rack price and power hook-up price aren't too high, I wouldn't
go with blades (unless you have a big budget and showcase needs),
but sometimes those prices are SO high
that blades can be competitive in the long run.
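To put rough numbers on that trade-off, here is a quick back-of-envelope
sketch. Every figure in it is an invented placeholder (per-node prices,
density per rack, facility fees), so plug in whatever your vendor and
facilities people actually quote:

# Back-of-envelope: standalone 2U servers vs. blades.  Every number is
# an invented placeholder -- substitute real quotes before deciding.

def total_cost(per_server, servers, per_rack, rack_fees):
    """Hardware cost plus per-rack charges (rack rental + power hook-up)."""
    racks = (servers + per_rack - 1) // per_rack   # integer ceiling
    return per_server * servers + racks * rack_fees

SERVERS = 30
for rack_fees in (10000, 100000):   # cheap vs. expensive facility
    standalone = total_cost(3000, SERVERS, 18, rack_fees)  # ~18 2U boxes/rack
    blades = total_cost(5000, SERVERS, 48, rack_fees)      # dense chassis
    print("rack+power fees %d: standalone %d vs blades %d"
          % (rack_fees, standalone, blades))

# Cheap facility:     standalone wins (blades cost more per node).
# Expensive facility: blades win (one rack and one hook-up instead of two).

The exact numbers don't matter; the shape does. Once the per-rack and
power hook-up charges dominate, the density of the blades starts paying
for their per-node premium.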
-tl
On 11/21/2011 01:39 PM, Lennart Sorensen wrote:
> On Mon, Nov 21, 2011 at 01:17:09PM -0500, Ted wrote:
>> You don't need 2 racks for that.
>> That's an HP blade system, 3/4 of a rack, UPS, KVM, redundant, etc.
>> I have designed a few, but I am intrigued by the latest claims
>> of AC-free, low-power servers. AC can be a nightmare, as it will
>> eventually break and leave your equipment in a sauna :(.
>> Cost? HP-based, with your own storage (as opposed to a NetApp, etc.),
>> $90k before cooling. If it's worth doing, and downtime would be
>> costly for you,
> I was going to guess $50000, but I certainly wasn't thinking of blades.
> I know where I work we have one blade server, and if the IT department
> gets to have their way it will be the only one we ever have. They hate
> it for some reason. And I have no idea what AC costs to install either.
>
>> I would recommend only HP or IBM equipment.
> Never had an issue with IBM gear. I don't particularly like HP stuff.
>
>> I would also recommend an additional unit as a backup to your backup;
>> with only dual redundancy, you have none left once the first piece
>> fails :) (rough numbers in the sketch after the quoted thread below).
>> If you want to build it yourself and go cheap but quality, then use
>> 2-3U cases with ASUS server boards (they'd be about 60% of the price,
>> but use up more rack space).
>> Just my $0.02. In my experience with HP blade systems:
>> 5 years, 30+ blades, no failures, and multi-year uptimes.
> Can't disagree with that.
>
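On the backup-to-the-backup point quoted above, a quick probability
sketch of why that matters. The 2% per-unit failure chance during a
repair window is an invented number, purely for illustration:

# Chance of a total outage while you wait for a repair, assuming each
# unit independently has a 2% chance of dying in that window (an
# invented figure -- the point is the ratio, not the number).

p_fail = 0.02

def outage(units):
    """Probability that all `units` remaining boxes fail at once."""
    return p_fail ** units

print("pair, both healthy:      %.2f%%" % (100 * outage(2)))  # 0.04%
print("pair, one already dead:  %.2f%%" % (100 * outage(1)))  # 2.00%
print("three units, one dead:   %.2f%%" % (100 * outage(2)))  # 0.04%

A pair is only redundant until the first failure; after that you are
one fault away from an outage for the whole repair window. The extra
unit is what keeps you covered while you wait for parts.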
--
The Toronto Linux Users Group. Meetings: http://gtalug.org/
TLUG requests: Linux topics, No HTML, wrap text below 80 columns
How to UNSUBSCRIBE: http://gtalug.org/wiki/Mailing_lists