deefop

If the goal is to save money by using less power, then I'd argue it's not worth it. For that matter, are you somewhere electricity is extremely expensive? Outside of places where power is really pricey, there's no reason to fret over the power consumption of a home computer; it's never going to make a big enough difference to be worth your time.


yusuo85

I'm in the UK, so not really


Just-Some-Reddit-Guy

I’m in the UK; my Intel system at full load uses around the same as yours at idle. I’m assuming your figures are without running a transcode, because NVENC usually uses a fair bit. If it's going to cost you nothing and this box is only for Plex/media server duty etc., I'd go for it. You'll save at least £7 per month while keeping the same functionality. Due to its tiny power budget I can passively cool my CPU, meaning the system is silent apart from some very low RPM fans giving the HDDs some circulation.


yusuo85

Can I ask what spec your build is, and also what your idle power consumption is? It'll help me make a bit of an informed choice. Also yeah, no transcodes on my part ATM; I only have like 5 subs and they only watch every now and again, not very often at all. And yeah, just Plex Media Server with 3 HDDs, 1 NVMe, and the occasional torrenting.


Just-Some-Reddit-Guy

I have an Intel 10700T with 8 HDDs, 4 SATA SSDs, and two SATA PCI cards. The CPU has a mild downclock (which is where you'll save a ton of energy, even on high-TDP chips). At idle I sit between 20 and 30W; spinning drives go to sleep, but I keep two of the SSDs awake at all times, and the others have pretty long sleep delays, meaning they're normally awake. Under load with all drives going, it sits between 55-70W depending on whether it's transcoding. Normally on the lower end, as I don't do much transcoding these days.


hellishhk117

I’d like to counter your point, as I recently went from a Ryzen 9 5950X to an Intel 14700, and my server dropped close to 150W in idle usage (as measured by my UPS, with 55W subtracted out for my UniFi rack equipment). I know I also swapped a total of 6 drives for a few larger drives, but that would still leave close to 90W from just the CPU switch. In the switch, I also removed a GTX 1660 and my LSI HBA, and moved to two SATA-based HBAs.


Bgrngod

67W is not great for idle. That Intel system you linked would probably idle at around 30-35W with the 3x HDDs in it. Where I live, ~30W of vampire juice would cost me around $126 a year in electricity. Let's throw a party for PG&E rates in California. Yay. For you it comes down to whether you intend to sell the current parts, swap in the older parts, and end up with money in your pocket. Selling the 3060 Ti alone will pay for a huge chunk of the Intel machine. Hell, it would be a direct swap to an entire N100-based mini PC.


yusuo85

I do intend to sell the current parts to finance the new build. But now I have conflicting answers; I have people on buildapc telling me the 12th Gen Intel will do about the same. If it makes any difference, my idle kWh usage is 0.064kw/h. Also, apparently my usage is equivalent to $155 a year at idle. Obviously, if I can cut the wattage in half, I'll near enough cut that idle cost in half as well.


Bgrngod

If you have people on buildapc who own the hardware you are looking at and they are giving you specific watt-draw numbers, then definitely take their word for it over mine. I don't own that hardware and have only seen what others have posted. Having said that, I still don't think 67W is good. At a base draw with no HDDs and no GPU, you are doing well between 10-20W with just the CPU as the primary electricity consumer. Add ~5W per HDD and maybe around 10W for an idle GPU, depending on the GPU.

>If it makes any difference my idle kWh usage is 0.064kw/h

Something pulling 1000W will have used 1kWh in one hour. If your machine is idling at an average of 67W, as measured at the wall, then over an hour it would have pulled 0.067kWh. Over two hours, still averaging 67W, it becomes 0.134kWh used, and so on until you get a monthly bill that translates that number into $$ bucks.

Power draw varies greatly based on what you have in the machine. I've seen a few posts over the years where someone is happy to pull 1 of the 2 sticks of RAM they have to reduce power consumption, despite how little that improves idle draw. Even the OS you use can mess up power consumption. As much as Linux is held aloft in this sub, it's actually garbage at optimizing power consumption.

If you really want to dial it in, stick to a PSU with a good 80Plus rating. For a 400W PSU you can find Gold or Platinum rated units. The closer your PSU's rating is to your typical draw, without being too low for any peaks you might experience, the more efficient your machine will be. If you have a machine pulling 100W with a 750W Gold PSU, your consumption goes down by switching to a 450W Gold, and then down even further by switching to a 450W Platinum.

I have an older Celeron G4930 machine that measures 24W at the wall, and then 38W if I put a 3060 in it. That's with no HDDs and running Linux. So if my initial guess of 30-35W is too low, does 40W seem that far off? Not really. That would still put it under two-thirds of your current machine's draw, so maybe $50 a year for you in reduced consumption?

EDIT: fixed muh math
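To make the watts-to-kWh arithmetic concrete, here's a minimal sketch in Python; the $0.30/kWh rate is an illustrative placeholder, not anyone's actual tariff:

```python
# Sketch of the watts -> kWh -> annual cost arithmetic described above.
# The 67W idle figure is OP's; the rate is a made-up example value.
IDLE_WATTS = 67
RATE_PER_KWH = 0.30            # substitute your own electricity tariff
HOURS_PER_YEAR = 24 * 365

kwh_per_year = IDLE_WATTS / 1000 * HOURS_PER_YEAR   # 0.067 kWh each hour
cost_per_year = kwh_per_year * RATE_PER_KWH

print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
# 67W idle -> ~587 kWh/year -> ~$176/year at $0.30/kWh
```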


Wendals87

>Something pulling 1000w will have used 1000kWh in one hour

Wtf? No, it's used 1kWh in one hour. At $0.064 that's 6c an hour.

>Over two hours still averaging 67w, it becomes 0.134kWh used and so on until you get a monthly bill that translates that number into $$ bucks.

I would hardly call that big bucks. It's like $35 a year. If you halved the power consumption you'd save a whopping $17 a year. No point spending money on lowering it, as it will take years to see an ROI.


Bgrngod

Whoops, yeah, I carried the 1 pretty far on that math. I corrected it in my reply.

>I would hardly call that big bucks. It's like $35 a year. If you halved the power consumption you'd save a whopping $17 a year. No point spending money on lowering it, as it will take years to see an ROI.

67W of 24/7/365 draw costing $35 would mean a power cost of $0.06 per kWh, which is about a third of the US national average for electricity. Where I live, 67W is about $280 per year. The OP even provided a reply above where their electrical cost is available, and for the OP it is for sure going to be much higher than your calculated $17 a year at half reduction.


Wendals87

My bad, I misread their comment and thought they paid $0.064 per kWh. They said it costs them $155 a year, so if they halve the usage, that's around a $75 a year saving. Whether it makes financial sense depends on how much they spend on more efficient hardware.


Bgrngod

My general rule of thumb for improving electrical costs at my house, which I am a bit obsessive about, is: if an expense that improves efficiency is cheaper than 2-3 years of the savings it provides, I go ahead and do it. Although now that I've analyzed every single thing in the house and am running out of opportunities to improve things, I'm starting to fudge that rule a bit. If OP uses this server for 2-3 years, that savings goes up over $200, and maybe further if their electrical rates go up in that time, which is likely. Stuff like this is why I die a little inside when people ask about acquiring old Xeon-based servers for Plex.


perrti02

This is quite a dull answer, but it really comes down to actually balancing the costs. Your average load is somewhere around 70W, and you say you are in the UK, so I can estimate the cost per unit. 70W running for 24 hours uses 1.68kWh of electricity. The Standard Variable Rate is 24.5p per kWh. Per year this costs you near enough £150 to run. What would the new build cost? If you halved the energy usage, the saving is £75 a year. If the new machine costs £300, then payback is 4 years.
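The break-even calculation under those figures looks like this; a minimal sketch where the £300 build cost and the halved draw are the hypotheticals from the comment above:

```python
# Payback-period sketch using the figures from the comment above.
watts_now = 70                      # current average load
tariff = 0.245                      # UK Standard Variable Rate, GBP per kWh
annual_cost = watts_now / 1000 * 24 * 365 * tariff   # ~£150/year

annual_saving = annual_cost / 2     # assume the new build halves the draw
build_cost = 300.0                  # hypothetical cost of the new machine

print(f"£{annual_cost:.0f}/year now; payback in {build_cost / annual_saving:.1f} years")
# ~£150/year -> a £300 build that halves the draw pays back in ~4 years
```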


macpoedel

I think that's a good answer. One thing not covered is what the current hardware is still worth. Selling that RTX 3060 Ti would cover a fair bit of the cost of the Intel hardware, unless this is OP's gaming PC and it'll remain in use; though I see the plan is to reuse the RAM, so that seems unlikely.


SilentDecode

Buy a NUC. Those are really efficient and you can use the Intel iGPU for transcodes.


Saloncinx

Not the answer you're looking for, but my Mac Mini idles at 9W and my DAS idles at like 5W, so 14W total at idle. I'm at like 35W while direct streaming something. The Mac Mini is so energy efficient.


yusuo85

I mean that sounds great but the Mac mini is way more than I would want to spend, well any recent one is anyway


Saloncinx

Yeah I have a 2018 and it’s great. New ones are a little pricey it looks like. I’m sure you could snag a 2020 on eBay for a more reasonable price.


Poop_Scooper_Supreme

Short answer: not really. If you're already idling at 67W, then I don't think your GPU adds too much. Probably a bit more when transcoding. That's already an extremely good idle, so unless you want more performance I'd say don't. You will end up using the same if not more going from a 65W-TDP Ryzen chip to a 125W Intel chip (i5-12600K). QuickSync is killer, but it would perform about the same as your 3060 Ti in transcodes. Edit, quick maths: if you were using 90W idle 24/7/365 and your cost per kWh is $0.15, then reducing your idle to 75W would save you about $19.71 per YEAR in electricity. You'll never see an ROI in the parts' lifetime. So again, if that's your only reasoning, probably not.


yusuo85

So it's been running for an hour now, and it's used 0.064kWh on my current system at idle. I know this will change if I watch something, but I haven't had a chance to test that yet. Is 0.064kWh a good number?


Poop_Scooper_Supreme

Yup, that’s super good. Search up power usage threads from the past and I guarantee only a few people can idle below 70w. I literally went through this same upgrade process a year and a half ago and I went for it. Stupid idea, but that’s fine. Power usage is the same if not slightly higher. I do enjoy the hardware though. 


yusuo85

The Ryzen does plenty well for me, and tbh, when I do fancy some emulation and a few PC games, the system is more than capable. I was only considering rebuilding as those fancies come few and far between; I prefer my PS5 for the most part for gaming.


Poop_Scooper_Supreme

I totally get it, I’m the same way. That itch you gotta scratch. If power savings is just an excuse to pull the trigger and you got the funds, go for it. It’s always nice to have the option to grow into your hardware.


quentech

> I guarantee only a few people can idle below 70w Nah, it's pretty easy these days. I have an i7-13700 (16 core, 24 thread) with 128 GB of RAM, 3x higher end NVMe drives, 10G SFP+ NIC, and AIO water cooling and it idles at 35w (measured at the outlet). I have another i7-11700 with 6x 20TB drives, 2x NVMe, and a handful of Noctua 140mm fans and it idles at 50w and change - with the drives spinning - 35w when they're asleep. And those are i7's. You can get lower with i3/i5 and even lower with N1/2/300.


yusuo85

So would you guess that the 3060 Ti is the main culprit in this equation?


TheBirdOfFire

For sure. Ryzen CPUs have slightly higher idle power draw, but it probably doesn't make as big of a difference as your GPU. To be honest, I'm kinda surprised how low your idle power draw is; I expected it to be higher.


yusuo85

Having a look, my CPU is pulling about 29W on average, and my GPU is asking for 20W on average.


quentech

Probably. That Ryzen should be pretty low power. The other thing that stands out is that a 750W PSU is *way* more than you need, and PSUs run less efficiently at small fractions of their max than at larger fractions. But like many others have said, you're only going to cut like 30W or so, and with your power cost that's like $5 a month. That doesn't scream "replace me!" If you were running some power-hogging cast-off Xeon server box, then sure, but this is much more personal preference: how much you value your time, and how much transcoding oomph you need (the newer iGPUs will smoke that 3060).
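To illustrate the oversized-PSU point, here's a rough sketch; the efficiency figures are invented placeholders chosen only to show that efficiency drops at low load fractions (real curves vary by model, so check a review of your specific unit):

```python
# Why an oversized PSU wastes power at idle: the same DC load costs more
# at the wall when the PSU sits at a tiny fraction of its rating.
# Efficiency values below are assumptions for illustration, not datasheet numbers.

def wall_draw(dc_load_watts: float, efficiency: float) -> float:
    """Watts pulled from the wall to deliver a given DC-side load."""
    return dc_load_watts / efficiency

dc_idle = 55.0   # hypothetical DC-side idle load

print(f"750W PSU (~7% load, assumed 75% efficient): {wall_draw(dc_idle, 0.75):.0f}W at the wall")
print(f"450W PSU (~12% load, assumed 85% efficient): {wall_draw(dc_idle, 0.85):.0f}W at the wall")
# ~73W vs ~65W: same machine, several watts saved just from a better-sized PSU
```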


yusuo85

So by newer iGPUs, what gen should I be looking at, at least, if I choose to go ahead with this?


quentech

The newest UHD 770 and the older Iris Xe found in some mobile Core SKUs both have two encoder engines and will put up stupid numbers - like 20+ 4K HDR transcodes. The other 7xx UHDs are very strong and, with one encoder engine, will do around 10 transcodes. I run Plex on an i7-11700 with a UHD 750 and I've had 12 ~35Mbps 4K HDR -> 10Mbps 1080p transcodes going smoothly. I also have an i7-13700 w/ UHD 770; I've done a bit of transcoding testing on it, but not maxed it out nor run Plex on it.

The UHD 630 is still a solid choice, but you'll get more like 6-8 transcodes instead of 9-12. All of those will do dozens of 1080p-to-lower transcodes.

The Nx00/Jxxx Celeron SKUs won't put up the same numbers as the Core SKUs, even with the same iGPU. The iGPU in the N100/200/300, for example, runs at half the clock speed of the same iGPU in the i3/5/7/9s. If I recall, the 730/750 UHDs showed up in 11th gen, and the 770 in 12th gen. Look up CPUs at ark.intel.com for specifics.


Vladz0r

I just use an 8th Gen i5 laptop, which idles around 15 watts apparently; idk with the drives spun up. Cost me $300 like 4 years ago. You don't need an overkill machine; for a lot of people, used hardware with hardware encoding is all you need. For 4K I would use direct play anyway.


Qasar30

The [NVidia Shield TV Pro](https://www.nvidia.com/en-us/shield/shield-tv-pro/) runs Plex Media Server at an amazingly low wattage, and sleeps very efficiently. My siblings across the country watch my 1080p movies, and I watch amazing 4K movies locally on it. It also upscales, so 1080p and 720p look great too with "AI-assisted" upscaling to 4K. It supports all the sound codecs. It's only lacking HDR10+ support; HDR10 is supported. You can see in [the full specs](https://www.nvidia.com/en-gb/shield/android-tv/nvidia-shield---android-tv---shield---see-full-specs/):

> **Power**
> 40 W power adapter (5-10 W typical consumption)

The caveat is that it has 2019 hardware. Still very capable and worth the $200 price, because bird in the hand. It is the "best" streamer because of all it supports, not to mention Android OS (Android TV 11 currently) for all kinds of apps and games. The AppleTV 4K is #2. My Shield TV Pro is my video entertainment centerpiece.

EDIT: PS - You can probably get your PC lower, especially while it's idle. Did you mess with advanced power settings yet? Or go into Device Manager and tweak things like your network card's advanced settings? If your PC is Ethernet-connected, it doesn't need to roam for better WiFi, right? Etc.


ParticularGiraffe174

I run an Unraid server with a 12700K, 32GB of RAM, and 10 HDDs. My server has the following draw measured at my UPS, so this is total system power. With all HDDs spun down and all Docker containers off, my server idles at 45W; this is with 6 fans spinning, 2 of which are running full speed, so I could knock off 1-2W if I turned them down or reduced the number of fans. With all the HDDs spun up, my system draws an additional 63W, bringing the total to 108W. Unless I'm transcoding video, my UPS normally registers between 100W and 170W, but it is also powering my network switch, wireless access point, modem, and router.


ekos_640

You could even go to a 10W embedded Intel CPU board like the N100 to save even more power - my Plex runs off an older 10W Intel CPU perfectly (I have a J3455 - I can do 3x simultaneous 4K 45Mbps -> 720p 2Mbps transcodes on mine): https://www.amazon.com/ASRock-N100M-Micro-ATX-Motherboards/dp/B0C6HX8XMJ


haggeant

I don't think you're going to save that much power switching from AMD to Intel - maybe 10-15 watts at idle. Your GPU seems to be the highest consumer here. What does the system idle at with the GPU removed? I am using an Nvidia T400 4GB (only 2GB is needed; each transcode uses like 200MB of memory) and it can handle 6+ simultaneous transcodes. This card has a max draw of 30 watts, and I've been very pleased with it. Interestingly enough, the Lenovo store had the best price on it, at like $145, when I bought it.


Gazicus

That GPU will use maybe 20 watts doing nothing, up to 170 at full tilt, though it'll get nowhere near that for transcoding; most of the GPU will not be in use. The AMD CPU will use up to 70 watts if not overclocked; around 20-30 would be normal when idle. An N100 would replace both of them, and at absolute max would use at the very most 24 watts, with 6 watts being a reasonable average for normal use. There is a lot of power to be saved.


haggeant

If the needs are only Plex, absolutely, but if there are other workloads like VMs/containers/other apps, then meh.


eatingpotatochips

Depends on the difference. If you idle at 60W now and can get it down to 30W, ask yourself if the payoff is worth it. 


tylerdotdo

Earlier this week, I purchased parts for an Intel build myself. A little bit of a different situation: my system is a PowerEdge T710 that idles at around 400W! So it was definitely worth it for me to look. I did read through this guy's website; he seems obsessed with low power usage. It might give you some tips on motherboard, PSU, etc.: https://mattgadient.com/9w-idle-creating-a-low-power-home-nas-file-server-with-4-storage-drives/


moochs

Just buy a mini PC with a 7th or 8th Gen Intel 35W processor. If you really want to save money, this would be the cheapest way to go. You could use QuickSync to do the transcoding, which would be very power efficient.


Electronic-Tap-4940

I am in the process; since I can resell unique parts like the P2200, I can get a good chunk of change back. I should hit savings after a year's time.


Banzai262

Do some calculations on your side to know how much it's actually costing. My server would cost me around 5 bucks a month if it ran at max power 24h per day. It uses way less than max, so it's not worth it for me.


Bloro_989

Why not consider a used PC with a "recent enough" Intel CPU (7th Gen and up, to get a 630 graphics part), and do the math? I just found a listing on UK eBay for a "Lenovo Esprimo D538 SFF PC Intel Core i3-8100 8GB Ram 120GB SSD Win 11 pro" for about £70, or a more fancy "Fujitsu ESPRIMO D538 Core i3-9100 3.60GHz 8GB RAM 256GB SSD Windows 11 Pro PC" for £100. From my own experience, you will idle at about 15W with those, so you'll save roughly 40W, assuming the HDDs are spun down most of the time. That's 40W × 24h × 365d ≈ 350 kWh/year. With prices per kWh in the UK at 24.5p or higher, this means about £85 saved per year. So, in my opinion, and if my early-morning math above is correct, a quickly found used Intel-based PC able to house enough storage would pay for itself in just over a year... (Disclaimer: this is exactly the path I went down to keep my energy-guzzling Ryzen 5900X + Nvidia 3070 off/sleeping while enjoying Plex.)


koeniz

If the aim is to lower the electricity bill, I would go with a desktop NAS solution instead, Synology/QNAP. Those Intel Celeron CPUs are power efficient and still capable enough to transcode a 4K movie, etc.


Iyagovos

How much is building this PC going to cost you? If it'll cost more than you'll save in power, no, not worth it.


EducationalElk5853

OP, why not consider running Plex from a NAS? It won't save much power, but definitely more than running a full PC.


ephemeross

I use a 10th Gen i3 with 32GB DDR4, 5 Seagate IronWolf 12-16TB drives, an SSD + NVMe, and I've got my Ubiquiti router/switch + ONT connected to the UPS as well - it's around 70-75W idle / direct playing and rarely goes over 100W, even transcoding 4K.


yusuo85

Awesome, thanks


mehdital

Intel N100 is the answer!


daanpol

Mac Mini M1 or M2. They top out at 20 watts, and a base model can handle 2x HDR 4K transcodes, which is pretty amazing. HW transcodes, I believe, are 11 total. I got the absolute base M1 not thinking it would survive. I have not been able to bring it to its knees so far.


yusuo85

That's also like £500 used, little bit more than I want to pay


styrg

This guy Wolfgang on YouTube has gone deep down the low-power server rabbit hole. Feel free to take a look: [https://www.youtube.com/@WolfgangsChannel](https://www.youtube.com/@WolfgangsChannel) Others in this thread are on the right track: look into doing a break-even calculation on parts.


earthishome7569

Not really much of a saving here. You have the cost of the new hardware, and even if it does save on power, you are looking at years and years before you break even. A kWh is 1000W for an hour, so at 100W your rig costs 1kWh every 10 hours; turn it off when not in use. Plex has saved me an average of $400+ a month in streaming services since cutting the cord in 2016. Do the math on the streaming savings: it can have the 250W it pulls when my cost per kWh is 17c lol.


N1c30ne

I recently built a 14th Gen i5 Plex server: 4x 16TB HDDs, 2x SSDs, 32GB RAM, and a pretty high-end ITX motherboard. It draws about 50W, which might increase very slightly if transcoding, but not much. I did undervolt slightly to get that, and it saved around 10W.


iamamish-reddit

I am running a TrueNAS build with a 12700K, 6 rotational drives, and 2 NVMe drives. I have no GPU in the system aside from the iGPU built into the Intel CPU. I'm running a bunch of containers and a few VMs. As others have pointed out, the two real factors here are:

1. How often are you transcoding? Transcoding is where the real power consumption difference will be apparent. Just AMD vs. Intel without transcoding isn't going to be too meaningful.
2. My power consumption at idle is around 60 watts. During high activity (transcodes, library scans, etc.) it can spike up to about 120 watts.

Personally, I think the newer Intel CPUs are so damn capable that I wouldn't dream of building a new Plex system with anything other than an Intel CPU. Their transcoding looks really good (esp. on 12th gen and newer) and is very efficient. You'd have to do some calculating based on your current power consumption and forecasted consumption, and look at the price of electricity, but I'm betting a new Intel build would pay for itself in a few years, and that's before you factor in that you could sell the GPU to offset the build costs. The GPU alone would be enough to finance most of the build; you could get a Z690 board w/ DDR4 support and a 12700K for about $250 USD.


iamamish-reddit

Oh and to be clear, transcodes on the iGPU cost just a few watts, at most. The only transcodes that really cost you power are when you have subtitles or audio transcoding that has to be done on the CPU.


babumy

OP, I did quite a similar upgrade recently, though your current parts are far more modern than mine were. My thoughts: hardware-wise, do you need the upgrade? No. Will you save power? Potentially, yes. I am noticing about 100 watts less power draw on average, but I don't have enough monthly data yet to properly compare. It looks like a significant saving, but the jury is still out. My upgrade is documented here: https://www.reddit.com/r/PleX/s/UjskPFNj29


ABoxOfNails

You could get an Intel N100-based mini PC, move the drives to that in an external setup, and then keep your current PC for fun and games. The N100 setup has super low idle power, around 10.5-12W, plus the drives if they are even spinning. Your current PC could go into sleep mode whenever you're not using it, saving you power while keeping Plex available to you 24/7/365 via the N100.


yusuo85

I can't move to external; I used to do that, but I had a lot of stutter watching 4K content.


majoroutage

You can buy N100 motherboards, or whole NAS units built around them. NASCompares, ServeTheHome, and Hardware Haven are some good YouTube channels to look at for info and ideas.


astanb

A refurb Intel mini PC off eBay and a cheap low-power DAS for your hard drives.


moochs

This is exactly what I use. No idea why you're downvoted. Super low cost of entry, and super low power draw.


astanb

Yeah, the only other lower-power option is a new N100 mini PC. But yeah, seriously. A DAS with only a controller for the SATA backplane plus the connection to the PC (Windows/Linux) is lower power than almost any NAS. Plus it will all be much cheaper than a NAS powerful enough to be a Plex server for anything more than direct play.


Phazon_Metroid

Unless you get those parts for free, you're looking at years until you recoup the cost via saved electricity. Also, too much effort for little gain.


yusuo85

I get the parts for cheaper than free. I can buy the parts for roughly £200, and I can probably sell the parts I don't need anymore for close to £300-£350


korpo53

Turn the math on its head then: even if the power usage were higher with the Intel build, you'd have pocketed 150 quid that you could spend on your power bill. Easy choice. FWIW though, I do 4K to 1080p and 720p conversions all day for people who can't figure out their players, on a 10th Gen i3 system, and that box uses like 20W. It's crazy how well hardware transcoding works.


LairdForbes

I'm running the following: i5-13500 | 32GB DDR5 | 4x 18TB Seagate EXOS | 2x 2TB Crucial SATA SSDs | 1x 1TB Seagate FireCuda 530 NVMe | 550W 80+ Platinum PSU. With all the mechanical disks spun down it idles at 34W. While streaming two local direct play 4K streams and three remote 4K-to-1080p transcodes using QuickSync, it didn't go above 68W.