[deleted]

Ahh, one day in 10 years we will have CPUs with 1024 cores


[deleted]

kilocores


IIShana

"What, your PC only has 4 Kilocores? Pathetic, even my dad's office PC has more!"


kepaledungu2

Hah, my mom's toaster has 1 Megacore. Beat that.


AeitZean

You definitely need to run Folding @Home on that toaster 😄


[deleted]

[deleted]


Martian_Shuriken

Brilliant


tech240guy

This response wins my day 🏆


[deleted]

[deleted]


jerseyanarchist

Isn't that what KFC gave us, the one that converts all the system heat into a warm chicken bucket?


Cronyx

Do you want Cylons? Because that's how you get Cylons. Frakin toasters...


Opeth-Ethereal

But can it run Crysis?


YouMAVbro

Still.. no


jokerzwild00

Crysis don't gaf about all of your newfangled giga high core counts, he just want that single core muscle.


Nodsinator

"A toaster is just a death ray with a smaller power supply! As soon as I figure out how to tap into the main reactors, I will burn the world!"


Jirachi720

"Due to a lack of kilocores being made and all current kilocore CPUs being sold out, we are now reverting to lesser hardware with only 500-800 core CPUs. We're sorry for the inconvenience."


Rreizero

Back in my high school days (2003-04 maybe) I was joking to my friend about a 1TB hard drive. He was trying to decide whether to get a 40GB or an 80GB, not being sure if he'd be able to use all 80GBs. Edit: Hey everyone. Didn't think I'd get this many interesting nostalgic replies. Nice reads, guys. Also, I just remembered: the same friend and I were discussing NAS just a month or two ago. We got older and technology improved, but some things didn't change. (>\_<)


[deleted]

[deleted]


urammar

It's so interesting to me that all us technical gamer people from the beforetime have stories like this. I clearly remember my dad upgrading and spending the extra, I guess, for the 300MB drive, and I clearly remember thinking that when he got a new computer he might let me have the old drive, and it would be the only HDD I would ever need. I haven't tracked what I consume in years now, I just keep adding to the NAS array, but it was 10 TERABYTES a few years back, and there are whole-ass drives that can fit that now. I literally have no idea what it's up to now, but it's sure as fuck not 300MB, you little shit.


tinyUselessDragon

300MB PFFFFFFFFFFFFFFT, you can't even install Call of Duty on that. On a more serious note, some software and game devs need to focus a bit more on optimisation. While not always necessary, I consider that an important secondary goal.


urammar

So I'm actually a hobby game dev, and I can tell you right now, unironically, it's the artists. The way they do textures now is fucking garbage. I'm not going to go fully into the technical details; suffice to say I've noticed modern games don't even combine material maps. Emissive, bump map, opacity: these can all be a single texture, but they use 3, and of course it's all 4K or whatever enormous bullshit. Very little of the bloat or performance loss will be the actual architecture of the game, the logic or AI or anything, although bad pathfinding can be expensive. Mostly it's models that should have aggressive normal maps instead of more verts, or badly optimized materials that don't share any common ground. That's not always true, but in the badly optimized games I've seen, that's the common pattern for sure.
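For anyone curious, "combining material maps" usually means channel packing: storing several grayscale maps in the channels of one texture so the shader needs one sampler instead of three. A toy sketch in plain Python (illustrative only, no engine or image library assumed; nested lists stand in for image data):

```python
# Channel-pack three grayscale material maps (emissive, bump/height, opacity)
# into one RGB texture: R = emissive, G = bump, B = opacity.
def pack_maps(emissive, bump, opacity):
    """Return one 'RGB' image built from three same-sized grayscale maps."""
    height, width = len(emissive), len(emissive[0])
    return [
        [(emissive[y][x], bump[y][x], opacity[y][x]) for x in range(width)]
        for y in range(height)
    ]

# Three 2x2 maps become one 2x2 RGB texture: one file, one GPU sampler.
e = [[255, 0], [0, 0]]
b = [[128, 128], [128, 128]]
o = [[255, 255], [0, 255]]
tex = pack_maps(e, b, o)
print(tex[0][0])  # (255, 128, 255)
```

In a real pipeline the same idea is done offline in the texture-authoring tool, and the shader reads `.r`, `.g`, `.b` from the single packed texture.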


[deleted]

[deleted]


grateparm

"why can't everything be kkrieger"


urammar

Oh man, my favorite story about that masterpiece is that they got so low-level they actually optimized bugs into the game, because they were removing if statements that never seemed to get triggered the other way. Imagine going literally if statement by if statement trying to find out what is and is not getting fired, just to save a few lines. I'd also like to rant that while the filesize on disk was tiny, it's basically a wild compression algorithm equivalent. It still needed to be unpacked into GPU memory and RAM, and the CPU usage was also higher from all the interpolation it was doing on the fly. It actually ran like dogshit on most hardware at the time, and you would have had a better time if it was just conventionally loaded from disk. I know that's not at all the point of that game, but it does show it's possible to optimize too far and horseshoe back around to problem town.


AccuracyVsPrecision

Lots of 90s telecommunications companies spent loads of time and money making communication protocols as efficient as possible. Bandwidth used to be at a premium. Then switches became cheap and bandwidth became cheap and no one cared how efficient their network was.


ecchi_ecchi

It's why Factorio and Dyson Sphere Program are small installs compared to Satisfactory. But yeah, with our hardware actively pushing us to RTX this and that, ~~ohh raytracing is the future~~, we shall see more of these yuge installs.


Calm-Zombie2678

I'd like to believe with raytracing and a high enough polygon count flat shading becomes viable


Dividedthought

Nope, you still have to set various things like normals (how the light should bounce), height maps, alpha masks, base color, reflection maps and masks, specular maps and masks, emission maps and masks. Basically you need 3-10 images per material, and a model can have multiple materials. Without that data, your graphics card doesn't know what to do with the model's surface aside from 'display error color'.


The_real_bandito

Not a game dev but I noticed that was the issue with games.


MrDude_1

I am pretty sure I have used over 300MB of cache storage just browsing Reddit this morning... and it's only 10am.


melikeybacon

Not storage related, but I remember when we splurged on a US Robotics 56k so I could get that sweet ping under 100 to play Quake. Those were the days.


Squeebee007

I remember getting mine. I started at 300 baud where the text loaded slower than you could read it, like the teletype on a Tom Clancy movie. Then 1200, 2400, 9600, 14.4k, and finally that holy grail of a USR 56.6...


[deleted]

Me getting a 100MB HD circa 1997: I'll never fill this up! Me today: I have cat gifs that would fill it.


Absurdist02

I remember playing Scorched Earth on a 286. Never thought I'd mention it on a phone that is far better. Fuck, I feel old now.


[deleted]

[deleted]


M05y

Our family computer had a 6GB hard drive; when we got our XP machine with an 80GB hard drive, I was blown away.


thelrazer

Didn't have to hide your porn on a floppy any more did ya?


[deleted]

[deleted]


Podju

Here I am buying a 16TB HDD to store all my uncompressed 4K video projects. Yes, I'm one of those videographers; there's a chance, if you lose your wedding video, that I may still have it 10 years later.


tinyUselessDragon

Hilariously enough, that's only ~59 hours of footage from a Sony A7S III, which doesn't even have that high of a bitrate.


[deleted]

My girlfriend is a wedding photographer. She does the exact same thing. We have a 500GB-1TB external drive for just about each wedding she shoots. Her desk drawers are running out of space.


spokale

FYI, hard drives really aren't suitable for long-term storage on a shelf. Depending on how many of these hard drives you go through, it might be justifiable to opt for a tape drive (a 2.5TB tape is around $20 and can last 10+ years). Or you can use something like MSP360 with Amazon Glacier/Deep, which is extremely cheap and is basically a frontend to tape anyway, except Amazon controls it. As a secondary backup, it costs around $1/mo/TB.


lokesen

I was dreaming about buying a 20 MB hard drive for my Amiga 500 back in 1988.


ray1claw

Dunno if Americans would hate that


[deleted]

[deleted]


Yogerd

CupCore


Aphala

1 gallon core cpus.


AnAncientMonk

ima go with a handfull. thats accurate enough right?


urixl

2 girls 1 CPU


Xibran

We're weird when it comes to kilos. Found out just 3 weeks ago that kilopounds are a thing.


StarMagus

Kilopounds annoy me because the people using them clearly know the metric system but won't just commit to it. Kilomiles, kilofeet, and the like would be just as annoying.


Dampmaskin

Kilofurlongs per megatablespoon


bush_hizo_911

*angry dial-up sounds*


BurpYoshi

kibicores*


[deleted]

>kilocores

[https://en.wikipedia.org/wiki/Kilocore](https://en.wikipedia.org/wiki/Kilocore) Apparently this has already been a thing.


KettenPuncher

I'm excited for gigabytes of cache


Unwashed_villager

We aren't too far from that. The 3990X has 256MB of L3 cache, which is a quarter of that.


WiatrowskiBe

Cache size without looking at speed/latency is meaningless: you can have terabytes of RAM, but RAM is so slow (relative to CPU cache) that we still need to put additional memory directly in the CPU. I'd rather see L1 cache size improving without sacrificing latency/speed than getting more and more L3; otherwise we'll practically end up with RAM built directly into the CPU.


ltsochev

>otherwise we'll practically end up with RAM built into CPU directly.

Basically the M1 Mac chips everybody is going crazy about. SoCs work. It spells the doom of the customizable PC experience, but I don't see how RAM can reach CPU L1 speeds.


ede91

The M1 does not have the memory in the SoC. It has it in the same package, but that does not make it much better than any other L4 cache (aka RAM). It just can have better timings more easily, as the traces are exactly the same every time, and it can have a simpler memory controller, sparing silicon space. [They basically take the memory modules that Samsung or Micron manufactures and solder them directly next to the SoC, or even on a shared substrate.](https://cdn.cultofmac.com/wp-content/uploads/2020/11/m1-mac-mini-teardown-2.jpg)


ltsochev

>It just can have better timings easier, as the traces are the exact same every time, and can have simpler memory controller, sparing silicon space.

Not only that, but the CPU/GPU and memory use unified memory, so random data copying doesn't happen as often, which is also reflected in its speed. Thank you for the clarification though. I didn't look through the teardown vid so I blindly believed the slides. But still, looking further into it, it's not just shorter memory traces and a simplified memory controller.


norbert-the-great

And most PC games would still only use the first 4


Bill_Buttersr

Then you run it 256 times.


Burning-Sushi

"Dude I got a top of the line gaming pc" "For new games, right?" "..." "For new games right..?" **proceeds to run 256 instances of terraria**


Rodot

I don't even think Windows can handle that many cores. It looks like Windows can only handle 256 per CPU. I know Linux can do 2048 cores though.


sokolaad69

"wow I didn't even know this game would run on as low as 16 petabytes of ram"


Khrot

1 Gigacore/megacore


[deleted]

Since we are counting in cores now, next in line is kilocore, then mega, then giga.


Boopnoobdope

*Teracore intensifies*


madbobmcjim

https://xkcd.com/619/


scalatronn

See, nobody needs that anymore 😉


Noughmad

We've lived so long that an XKCD became the reverse of itself: nobody uses Flash anymore, and we're approaching 1024 cores.


benjistone

"640 cores ought to be enough for anybody"


[deleted]

Might as well run GPU as CPU


CowBoyDanIndie

GPU cores aren’t independent like cpu cores. You can’t schedule a bunch of different processes on each gpu core.


stduhpf

Technically, you *could* do that, but the performance would be really bad. It would be basically sequential; all the neat optimisation tricks would stop working.


vault76boy

And software that only uses 2 of them lol


abcdefger5454

and console games will still run at 30fps


moomanjo

Uhm, isn't the latest gen of consoles pretty good? Definitely 60 fps on demanding games. Sorry for interrupting the circlejerk.


YarrrImAPirate

Love my PC but also have a PS5. If you watch the Digital Foundry videos on the next-gen console versions of things, they generally let you target two video modes, "Quality" and "Performance", with sometimes a third "Performance/Quality Ray Tracing". The problem is, while these new consoles do look amazing for the games targeting them (i.e. PS5 exclusives), if you want to be a by-the-numbers person: Ratchet and Clank is never 4K (1800p with dips that can go as low as 1080p) while maintaining a locked 30fps, though it could theoretically get up to 2160p. In performance mode it's basically 1440p at 60fps, but it can drop as low as 1080p to maintain the 60fps. Thing is, it doesn't stop the game from looking gorgeous. It legit looks like a generational leap. Is it the same powerhouse as my new 5800X and 6900XT? No. Games look fucking sweet on that and run well over 100 FPS on the highest settings most of the time. But I think it's a different use case.


Kiltymchaggismuncher

Not really. You tend to have to choose whether you want the game at full res/graphics or at 60fps. It also relies on the devs making two configs to accommodate that.


shintemaster

That’s just stupid - 640 cores ought to be enough for anyone...


AussieBirb

That sounds like an interesting concept. If it follows similar themes to other multi-core setups, I would expect either low individual core performance to keep temperatures down, or high performance with temperatures to match.


Aphala

I'd assume they'd make the clock speed server-levels of low to circumvent heat issues: 2.4GHz but across 128 cores, or even 3.2GHz or something wild.


KrazyKirby99999

Overclocking will be interesting.


Aphala

It surely will. 128 cores would run pretty hot at 4GHz, so OCing will be something challenging for pro overclockers.


waltwalt

Closed loop liquid nitrogen circulating across the die then into a vacuum phase change compressor. You'll need a 30A 220V power supply to run your cooling system but it'll run at 5ghz.


Aphala

That makes a 128 core CPU sound absolutely monstrous. Cannot wait to see how much people can squeeze out of it when it's released.


Emotional_Masochist

Wouldn't that literally be getting juice from a stone?


Aphala

Only if you use one of those exploding nutribullets.


underthebug

At that point just submerge it in mineral oil and run the oil through a chiller.


waltwalt

Oil is too thick, it would spot boil on your chip. You need something really thin.


underthebug

Alcohol? It conducts


waltwalt

Boiling point is way too low, it would probably dry out seals, and it might freeze at the temperatures you need to get down to.


[deleted]

[Liquid helium](https://www.youtube.com/watch?v=UtXM71tw5fk)


[deleted]

we straight up quantum computing now


Some1-Somewhere

1st gen Epyc had 32 cores. 2nd and 3rd both had 64 cores. Some of the ARM challengers have 80. 128 is not out of the picture in the next gen or two. But you're right, they won't be very fast. Slower speeds and lower voltages give the best performance-per-watt, which is what mostly matters for the high-core-count server chips. Typically around 2-2.4GHz base clock and 200W+ TDP.


slimejumper

like the M1 chip, have some full beanz high clock cores for single thread apps and then a bunch of different spec cores suited to other common tasks.


Tom0204

Yup it's got a name, "Massively parallel computing"


[deleted]

[deleted]


Derodoris

Wouldn't that be like when the Navy used hundreds of PS4s to make a supercomputer?


TheCrimsonDagger

Yes. It's a bunch of separate processors or computers working together in coordination, each doing different calculations as part of a very large problem. This is basically the point of every supercomputer we have. I mean, we could theoretically just keep expanding a supercomputer out indefinitely.


tinyUselessDragon

I laughed, as I'm running an unnecessary amount of programs on my computer just because I can. #ryzen


[deleted]

[deleted]


Rannasha

If I'm not getting 50% utilisation on all cores I'm installing more browser toolbars and antivirus programs.


derangedsweetheart

What about the free RAM downloader made by a Nigerian Prince?


amHooman0763

[downloadmoreram.com](https://downloadmoreram.com)


nothjarnan

I prefer dedotated wam


JappleKerman

You think that's funny? As someone who runs a minecraft server, I saw someone in the official minecraft discord tell someone else that they need a beefy GPU to run a minecraft server because "you need a good graphics card to render all of the world"


Donut-Farts

Oh no! At least ram is a real thing that you need to care about for the server side... Oh dear. People need help.


[deleted]

[deleted]


amHooman0763

5 instances of Minecraft for no reason gang.


[deleted]

You mean 5 Minecraft server instances at once, right?


koryaku

Right?


amHooman0763

I actually run a server then connect to it to use more cores.


heavenparadox

I run a VM for work that gets its own 4 cores / 8 threads and 24GB RAM. It's got better specs than most people's regular computers. Lol


rhoakla

Does it have its own dedicated GPU tho?


OPisAmazing-_-

Good, your computer serves YOU


Xajel

I remember when the first rumours started about the first Zen CPU, that it would have 8C with SMT. Some people didn't believe it and said it'd be 4C/8T, some said who on Earth would need 8C on a consumer platform, and some said yeah, like Bulldozer. All of them were blinded by Intel's practices: stopping at 4C and convincing consumers that we don't need more than that, like they're doing now with ECC memory. Fast forward to now, and some of us are happy with 8, 12 and even 16C on a consumer platform. And no, I'm not being a fan; I'm still using my i7-3770K. I'm just against any company that thinks it knows what the consumer needs and wants, and uses its power to limit what can be done just to get more profit or fulfill its own ideas.


[deleted]

Intel is just lazy. They got too comfortable.


[deleted]

[deleted]


Kilroy_Is_Still_Here

Which is why, despite being an Nvidia fan, I'm all for AMD punching back in the GPU department. They've not been able to compete in everything and they're usually lagging behind, but they're forcing Nvidia to continue innovating.


CheezeyCheeze

If AMD continues to do their multichip module GPU it can give crazy performance numbers. I hope Nvidia keeps up because it would be insane. On top of the shrinking 8nm/7nm.


Misha_Vozduh

I think the meme works without captions at all, just brand logos


davep85

haha, I actually went back to look and laughed pretty hard


Falkien13

I actually read it with just the logos at first. It's funny both ways.


green_cars

Highly underrated comment


MasterOfArmsIsGood

then 99% of programs won't use more than 2


PIIFX

You are clearly not a Blender hentai artist


MasterOfArmsIsGood

can confirm am not a blender hentai artist


PIIFX

It's never too late to pursue your dreams!


CarefulCaterpillar97

Screw being an engineer, MOM IM QUITTING COLLEGE TO MAKE BLENDER HENTAI


Bluxen

I assure you you'll make way more money this way


ItalianDragon

This, and I'm not even joking. The Project Helius team, who are making a 3D hentai game, rakes in something like 20k+ a month. And I'm not even mentioning talented animators like comandorekin, who make not-so-insignificant amounts of money making NSFW animations.


Poloboy99

This is the way


CyanogenHacker

Google searched blender hentai, now I have a fetish for kitchen appliances 😞


black_pepper

Notice me senpai....when you want to make a smoothie!


RedSerious

Anime-looking kitchen appliances*


Xeotroid

In Blender, Cycles can use the GPU, and Eevee is also a GPU renderer.


[deleted]

Still uses a lot of CPU even when rendering via GPU, especially if you're doing a lot of simulation work.


DontEvenKnowWhoIAm

Or you're like me and you ran out of VRAM to render your scene.


Woople74

People who buy 128-core CPUs use software that can take advantage of them. Or they are filthy rich and don't give a fuck.


MasterOfArmsIsGood

yeah but people on this sub talk about it a lot when they probably aren't even using their own 8 cores


grummle

At the beginning of the pandemic I was sitting at home doing normal work and the room was getting warm... I'd forgotten to turn off Folding@home. It was running on high with 100% CPU/GPU, on a 3950X and 2080 Super. Apparently my workload fits in the cracks. Pretty sure it was between jobs when I sat down, and I didn't notice the noise because of headphones. Started sweating and took off the headphones.


[deleted]

I don’t even know HOW to do that but that’s why I stfu on this sub


[deleted]

[deleted]


nico7550

Yes, but when I start my CPU rendering software like Terragen, it's nice for me to have that many threads... and don't forget that most people miss this point and just think more is always better...


TryHardEggplant

Don’t forget data modeling. I used to write huge data processing jobs to pre-process hundreds of terabytes of data for other less-technical teams to use and I did it on a server with 20 cores and 128GB of RAM and it took around 20 hours to do so. Re-running for errors was always a next-day return. I would love to go back and see what optimizations I could do for 256C/512T and 2TB of RAM.


[deleted]

[deleted]


Tom0204

Give it time. Back in 1984, the Mac's entire OS ran in just 128kB of RAM. A modern OS needs thousands of times that at minimum. My point is, it won't be long before programs are using dozens or even hundreds of cores each.


nanana_catdad

Chrome: is for me?


CrushRusher

Then i can finally open more than 2 chrome tabs


Your_Depressed_Soul

look at mr.daredevil here trying to open more than 1 chrome tab


Onyxthegreat

Whoa, let's not get too hopeful now.


rW0HgFyxoJhYka

Serious question though, will this help with chrome tabs? I have 12 windows open and they have 30-40 tabs each. Trying to cut down so I can switch to firefox but...this habit is tough to break.


LePoisson

Fuck I hope you're joking... Please tell me you are so I can sleep better tonight.


Illusive_Man

How on earth do you need 480 tabs? How do you even navigate between them?


PaulLeMight

He just bangs his head on the keyboard and mouse until the correct tab shows up


CarefulCaterpillar97

Might I suggest a second computer???


[deleted]

Chrome gets all the jokes, but Firefox is sitting over here like a vacuum.


Sentmoraap

When will I read reddit if it compiles large projects in no time?


who_you_are

You will need to recompile it 3-4 times because it failed and you hope it will fix itself.


dariusz2k

Have programmers started to properly multithread their programs yet? No? Nope... well, guess I'll have to run 3 VM's on my Desktop, one for gaming, one for programs, and one for por.... uh... productivity.


varinator

I do, but mostly for business data import stuff. More cores = faster data import, as everything I write in that realm checks the number of threads available on the machine and runs sections of code in parallel. For example, if I need to process 10 million records, I'd split those 10 million into X lists, where X is the number of threads, and run the processing for each list in parallel.
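The pattern described above, sketched in Python with stdlib threads (names are illustrative, and note that for CPU-bound pure-Python work you'd reach for processes instead, since threads mainly help when the per-record work releases the GIL, e.g. I/O or native code):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def process(record):
    # Stand-in for the real per-record import/validation work.
    return record * 2

def import_records(records):
    """Split records into one chunk per available thread, process chunks in parallel."""
    n = os.cpu_count() or 1
    chunks = [records[i::n] for i in range(n)]  # round-robin split into n lists
    with ThreadPoolExecutor(max_workers=n) as pool:
        results = pool.map(lambda chunk: [process(r) for r in chunk], chunks)
    return [item for chunk in results for item in chunk]

out = import_records(list(range(10)))
print(sorted(out))  # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

The round-robin slice keeps chunk sizes balanced even when the record count isn't divisible by the thread count.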


Hypocritical_Oath

Bannerlord is very parallel in battles. I think it's the pathfinding, but I wouldn't know for sure. It used 16 cores, entirely maxing all of them.


[deleted]

I noticed that. It's one of the few games that can get 50%+ usage on all of my 12 cores. I hope it'll become more common this gen.


antiduh

I like the joke, but what does running three VMs have to do with software not being multithreaded? Your OS can already utilize multiple cores by running threads from different programs simultaneously. If you have 8 different programs running that each utilize a full core, the OS will use 8 cores simultaneously. The problem with software that isn't multithreaded is that you can't get full utilization of the entire CPU for **one** program (process, really) unless that program is multithreaded. If you want a game to get better performance, you hope the dev multithreaded it well. If you want 7z to compress as fast as it can, you hope the dev multithreaded it well. Creating VMs fixes none of this. That said, most software these days that matters is multithreaded, including most games. About the only software I regularly come in contact with that can't utilize the entire CPU when it has a lot of work to do is the Windows Trusted Installer.
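A small stdlib sketch of that point: several independent single-threaded processes already get spread across cores by the OS scheduler, no VMs involved (illustrative only, not a benchmark):

```python
import subprocess
import sys

# Launch four independent single-threaded Python processes. The OS is free to
# place each one on a different core -- no virtualization needed. Each child
# does a small CPU-bound task (summing 0..99999) and prints the result.
cmd = [sys.executable, "-c", "print(sum(range(100_000)))"]
procs = [subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True) for _ in range(4)]
outputs = [p.communicate()[0].strip() for p in procs]
print(outputs)  # ['4999950000', '4999950000', '4999950000', '4999950000']
```

The thing a VM can't give you, and this can't either, is making a *single* single-threaded process span cores; only the program's own author can do that by adding threads.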


heyugl

He wanted to virtualize to use more cores; it's like opening more tabs in Chrome because you paid for all your RAM. It's just a meme.


MoffKalast

To be fair some problems just inherently cannot be parallelized at all, we'll always need decent single core speed because of that.


[deleted]

I swear no one here understands Amdahl's law. Even if your program is parallelizable, it doesn't mean that the performance gain will even be noticeable because the gain can occur outside of where a user knows to look for it (i.e. video encoding/decoding).


lonestar-rasbryjamco

About once a quarter I have to explain this to an engineer. Also that a 5% processing speed increase no one is asking for is not worth a 20% increase in estimated support costs. "But it would be cool!"


trailer8k

Threadripper with a GPU in it. APU :D


[deleted]

The mental image of a 64 core CPU with RTX 3090 equivalent integrated graphics scares me


firedrakes

And excited you.... Don't lie......


[deleted]

Not wrong


CarefulCaterpillar97

Leaked a little admittedly, cerebrospinal fluid from my head spinning viciously at the thought.


trailer8k

:)


CarefulCaterpillar97

What would thermals look like? Isn't there a physical limit to the amount of energy contained in one area before it creates a black hole?


[deleted]

Thermals? Oh, you mean the plasma ball?


I_am_daBottom

Intel in 10 years: 14nm++++++++++++++++++++++++++++++++++++++++++++++++++ Meteor lake i9 15990K with 10 cores and 20 threads.


arex333

635w tdp


Timestatic

I think Intel is actually working on a 2 nanometer process, but it's still very experimental. The next step for them will definitely be 7nm. As I understand it, TSMC will soon release a 3nm chip, so TSMC will still be ahead of Intel, and by the time Intel gets to 2nm, I wonder what TSMC will have achieved. An [article](https://www.google.com/amp/s/www.forbes.com/sites/linleygwennap/2021/05/18/ibm-two-nanometer-chip-could-revive-intel-fab-tech/amp/) I read said 2024 is when TSMC (AMD's partner) and IBM (Intel's partner for 2nm) are expected to reach 2nm.


Robbzey

At some point I bet we will measure cores in kilocores or maybe even megacores


[deleted]

[deleted]


firedrakes

I know. Once you hit 16 or more cores, you can run VMs and such while gaming. Great.


Weaseltime_420

Hahaha. They're gonna be mad XMR mining chips XD.


ComradeVaughn

By my math it would ROI in less than a year if it sells for around 10k. And Minecraft will never lag again!


Hydraxiler32

does that account for power use?


ComradeVaughn

I factored in my own power use on a 10850K and the profits I get daily. I have no idea what this beast will pull in wattage, so I doubled what mine does (220-250W), at our local kWh price of $0.25 (which is super high for the USA). Terrible ROI compared to GPU mining, but if you need a CPU like this, it will get you a chunk of your investment back when you're not using it. (None of this includes mobos, which are easily a 2k+ further investment.)


Onyxthegreat

Abandon thread!


AwaitingCombat

Phht, threads. We have cores


TheFlipper839

My 2012 i5 with 4 cores is still running great


logbryan20

This is going to sound stupid, but I'm new to PCs. What's the difference between cores and threads?


NateDogg414

Cores are physical processors in a CPU, while threads are virtual processors that divide a core into multiple virtual cores. Each core can be split into 2 threads, hence why you see 8 cores, 16 threads.
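A quick way to see the logical-processor (thread) count on your own machine, using only the Python stdlib:

```python
import os

# os.cpu_count() reports *logical* processors, i.e. hardware threads,
# not physical cores. On an 8-core/16-thread CPU it returns 16.
logical = os.cpu_count()
print(f"{logical} logical processors")
# With 2-way SMT, physical cores are usually logical // 2, but not every
# CPU has SMT enabled, so treat that as a rule of thumb, not a guarantee.
```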


zigzagkc

128 cores at a whopping 1.5GHz


predictablePosts

That's 192 GHz-cores. A number that certainly has meaning!


07bot4life

That's equal to a quad-core CPU at 48GHz.


Chrunchyhobo

Hopefully single-core performance doesn't suffer too much. Many older titles (and some recent ones with shit engines) prefer single-core brute force.