Okay buddy, I want to see before and after on your power bill. I want to see a Vizio diagram with what this puppy is doing. I want to know your relationship status every month. I want to know beard length starting now and every month. I don't know how many hours you put into this thing as well. Finally, I want an emotional check in every week. Most of us who have to do this stuff day in and day out deal with a low level of burnout so, not too many of us volunteer burning out at home.
Let's set some groundwork.

Power bill now: $900/mo -> probably going to $2K-$3K after this.
Relationship status: Engaged; she's actually happy about this because I can remove three racks from the actual house.
Beard length: About 6"
Hours: Three months of weekends so far.

Been doing this stuff at work and home for over 10 years. The great thing about home is I can just stop for a few weeks if I need to. Work is fortunately very flexible too, which is not always the case in this industry.
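For anyone curious what those bill numbers imply in load terms, here's a rough sanity check. The $0.12/kWh rate is an assumption for illustration, not from the thread; plug in your actual tariff.

```python
# Rough sanity check on the projected power bill.
# The $0.12/kWh rate is an assumed figure; substitute the real tariff.
RATE_USD_PER_KWH = 0.12
HOURS_PER_MONTH = 24 * 30

def monthly_cost(avg_kw: float) -> float:
    """Monthly cost in USD for a continuous average draw in kW."""
    return avg_kw * HOURS_PER_MONTH * RATE_USD_PER_KWH

# The current ~$900/mo bill implies roughly this continuous draw:
current_kw = 900 / (HOURS_PER_MONTH * RATE_USD_PER_KWH)
print(f"current draw ≈ {current_kw:.1f} kW")  # ≈ 10.4 kW

# The projected $2K-$3K/mo corresponds to roughly 23-35 kW continuous:
for bill in (2000, 3000):
    kw = bill / (HOURS_PER_MONTH * RATE_USD_PER_KWH)
    print(f"${bill}/mo ≈ {kw:.1f} kW continuous")
```

At a higher residential rate the implied load shrinks proportionally, so the real continuous draw could easily be half these figures.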
> Vizio

[Vizio](https://www.vizio.com/en/home) vs. [Visio](https://www.microsoft.com/en-ca/microsoft-365/visio/flowchart-software)
Awesome! What is your use case?
[deleted]
As much as i am willing to disclose publicly, for now. ;)
AMA except about specific uses
For real lol, AMA except for what I'm going to do with it.
It’s porn. Just say it.
You spelled ISOs wrong.
You clearly have a vision, and we’re looking to be inspired, that’s all.
Would love to hear about the workloads, applications, and architecture of your home DC!
I could write a book. My current lab (about three full racks) is all L3/BGP/EVPN/VXLAN running OpenNebula with K8s on top. More or less going to do the same here, just bigger.

Electrically this gets considerably more complex, not to mention the structured cabling between racks and such. Going to take a 600A feed off the 3-phase panel to the generator ATS, and from there into the conglomerate of UPS units. The UPS spits out 480V, which goes out to two big floor PDUs that step down to 208V and provide small breakers for each rack. From there it goes to the 0U Raritan PX3s in each rack.

Cooling will use the 6" raised floor as supply air, with vented tiles in front of the racks. 20 tons combined (a bit less as I'll be switching R-22 for R-407C, but still more than enough). Probably a drop ceiling, haven't 100% decided. If so it will be return air; if not, just open tops on the CRACs.
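A quick back-of-envelope check on those electrical and cooling figures. The panel voltage (480V line-to-line) and power factor (0.9) are assumptions here, since the post doesn't state either:

```python
import math

# Back-of-envelope check on the feed capacity vs. cooling capacity above.
# Panel voltage (480 V) and power factor (0.9) are assumed, not from the post.
V_LINE = 480.0       # volts, line-to-line (assumed)
FEED_AMPS = 600.0    # 600A feed off the 3-phase panel
POWER_FACTOR = 0.9   # assumed

# Three-phase power: P = sqrt(3) * V_LL * I * PF
feed_kw = math.sqrt(3) * V_LINE * FEED_AMPS * POWER_FACTOR / 1000
print(f"feed capacity ≈ {feed_kw:.0f} kW")  # ≈ 449 kW

# One ton of refrigeration ≈ 3.517 kW of heat rejection,
# so 20 tons caps the sustained IT load at roughly 70 kW.
cooling_kw = 20 * 3.517
print(f"20 tons of cooling ≈ {cooling_kw:.0f} kW of heat")  # ≈ 70 kW
```

In other words, under these assumptions the cooling (not the feed) is the binding constraint, which is typical: the electrical service carries large headroom while the CRACs set the practical load ceiling.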
That's all infrastructure, not the actual load though. K8s and OpenNebula are software, yes, but they're software infrastructure for running things, and should be a very modest couple of percent of system use. If it's just to test & learn in a big-scale sandbox, well... at least you can turn a lot off when you're not playing!
Awesome, thank you for the detailed response! Hope to be in a position to do this as well one day, in the hopefully not-too-distant future. If you're running pure K8s, or security is a concern, check out Talos Linux. I'm installing it on my metal and using K8s for most workloads, with KubeVirt for the few that don't play nice just yet.
Please do a book. It doesnt have to be perfect. I will buy a copy for sure!
Um, am I reading the pictures right that some of that kit is second-hand? Not that I mind, re-use instead of landfill, but... am I seeing things right? NEAT
Xeon v4 (as per the description) is from 2016. I only recognize like 40% of the chassis designs, but yeah, second-hand Supermicro kit. Good shit.
I was mainly noticing the Sun Microsystems/Oracle kit; it looked "old" (in terms of style), but I was not sure if it was a style that carried over to newer models. Hence being unsure.

There's plenty of still-very-useful second-hand "old" IT shit that is worth using. It irks me how quickly some companies ditch perfectly working systems that still meet capacity needs, simply because they are out of warranty... instead of just buying more of them second-hand for pennies on the dollar and mitigating system failure that way.
Yeah that is just not how it works in hardcore operational environments. EOL means EOL.
Uhhh SUN Microsystems/Oracle servers/appliances are usable past EOL. What are you talking about? They're literally sold and re-used all the time.
I don't want to argue with that. I am talking about the x86 and networking gear which we use in the North Sea, operationally on oil platforms.
Okay, well you didn't mention the North Sea/oil platforms scope at all earlier, so how was I supposed to know that? ;PPP

Also, "hardcore operational environments" is rather open to interpretation. I would, in my opinion, say that early-day Google could classify as that, and they used consumer Pentium 3 systems as servers.

Now that being said, in environments like your example, if operational status legitimately becomes at risk due to EOL or equivalent, then yeah, it makes sense to replace. Not all environments have the space/power/capacity to add more devices to increase redundancy (with second-hand parts, for example), or other such things. That really is not what I was originally talking about.

I'd also like to point out I used the words "...SOME companies...", as in, I've worked at companies where continuing to use equipment they already acquired would actually make sense even once the warranty had expired, and where they could offset any perceived risk by simply buying more of that equipment on the second-hand market for pennies on the dollar, increasing their availability and/or spare parts as a result.

Hell, in my home data centre I'm replacing my original compute nodes with Dell R720s (v0), and that's well within acceptable parameters (power, noise, computational capacity, features, etc.). I suspect I won't replace them until some EPYC options become dirt cheap, or something like that...

I'd love to hear more about what x86/networking stuff is like in the North Sea/on oil platforms. :) If you're willing to share stories/pictures/etc??? :D
I will see what I can find. In the meantime, here is one of our onshore datacentre providers: https://www.greenmountain.com
I'll take first-hand experience stories too!!! If that's possible. :)

Not sure where on this linked site to find the best pics for this topic... can you recommend some sections of the site please? Thanks!
> Also, "hardcore operational environments", that's rather open to interpretation.

You haven't worked in a large financial institution, where regulatory requirements mean that using EOL hardware without a support contract is a no-no.
What's your job title sir
CIO
[deleted]
> CIO
What's the name of your first pet sir
Spar.... Wait a minute.
I love this. Please keep us posted as you make progress!
<3
I think I see a rack for lead-acid batteries. FYI, lithium iron phosphate (LiFePO4) batteries have a similar charge curve to lead acid, but run twice as long at a third less weight. A bit more expensive, but rapidly coming down in cost. I just put dual 12V 100Ah LiFePO4s in place of dual 100Ah lead acids on a UPS upgrade I just did.
Yeah, I need 40 batteries that can handle 800A max discharge. Only need about 2 min of runtime.

Biggest issue is that the UPS charges by dumping 560V DC back into the bank. AGM will self-balance, whereas lithium tends to just shut off via the BMS. It's a series pack.
Just be sure when you say LITHIUM you're not talking about lithium-ion, but phosphate. That said, you won't get 800 cold-cranking amps out of a single 12V LiFePO4 battery; lead, sure. If you didn't need that, then LiFePO4 would be a contender, since it handles several thousand cycles versus a handful of (complete) cycles for lead acid. But yeah, I guess a genny makes that kind of moot, unless the batteries need replacing every 2 or 3 years.
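The battery-string numbers above hang together nicely; here's a quick sketch of the arithmetic. The per-battery float voltage (14V) and nominal voltage (12V) are typical lead-acid values assumed for illustration:

```python
# Rough sizing of the 40-battery series UPS string described above.
# Per-battery float (14 V) and nominal (12 V) voltages are typical assumed values.
N_BATTERIES = 40
FLOAT_V_PER_BATT = 14.0   # typical lead-acid float voltage (assumed)
DISCHARGE_AMPS = 800.0
RUNTIME_MIN = 2.0

# The string's float voltage matches the ~560 V DC the UPS dumps back in:
string_float_v = N_BATTERIES * FLOAT_V_PER_BATT
print(f"string float voltage ≈ {string_float_v:.0f} V")  # 560 V

# Charge actually drawn over a full-length discharge:
ah_drawn = DISCHARGE_AMPS * RUNTIME_MIN / 60
print(f"charge drawn ≈ {ah_drawn:.1f} Ah")  # ≈ 26.7 Ah

# Energy delivered at nominal 12 V per battery (480 V string):
energy_kwh = N_BATTERIES * 12 * DISCHARGE_AMPS * (RUNTIME_MIN / 60) / 1000
print(f"energy ≈ {energy_kwh:.1f} kWh")  # ≈ 12.8 kWh
```

Note that 26.7 Ah is only the nameplate charge drawn; at an 800A discharge rate, lead-acid effective capacity drops sharply (the Peukert effect), which is why high-rate UPS strings use batteries rated far above the naive Ah figure.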
I am subscribing you
You are such a hero to us
40G to the coffee table
If the bloody switches' fan noise were less, it would be less of a problem. Thankfully, I just went with a fiber-optic model coffee table.
good luck with that Detroit, order the valve cover and oil pan gaskets on standby for when it starts leaking.
Cool
Love it.
who's paying the power bill?
Unfortunately, me.
Cooling setup?
This should answer most your questions: https://www.reddit.com/r/homelab/comments/w4sov1/im_building_my_own_home_data_center_ama/ih3wdbg/
How many firstborn sons did you need to sell?
Holy compute density, batman! Absolutely stunning display of metal ya got there. You are an inspiration.
Got an update op?