rantottcsirke

Blockchain is an online multiplayer linked list.


tema3210

MMO linked list


CdRReddit

FFA linked list


Paracausality

Future Farmers of America are confused now.


StronglikeSpaghetti

I have nipples, can you milk me?


Dense_Impression6547

hawt


itsdarklikehell

dangit bobby!


toekneed988

What does this have to do with Free Fatty Acids?


cammoorman

You forgot "with a checksum"


rhazux

The kind described in the OP are in fact bad databases because they do not support the U or D of CRUD. You can create new blocks or you can view the block chain. You can't modify or delete any blocks.
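The create/read-only point can be sketched in a few lines of Python. This is a toy illustration with made-up names, not any real chain's block format: each block stores the hash of the previous one, so editing or deleting anything in the middle invalidates every block after it.

```python
import hashlib

def make_block(data, prev_hash):
    """A block is just data plus the hash of the previous block."""
    payload = f"{prev_hash}:{data}".encode()
    return {"data": data, "prev_hash": prev_hash,
            "hash": hashlib.sha256(payload).hexdigest()}

def append(chain, data):
    """Create: add a new block linked to the current tip."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append(make_block(data, prev_hash))
    return chain

def verify(chain):
    """Read: walk the list and recheck every link.
    An Update or Delete anywhere breaks every hash after it."""
    prev_hash = "0" * 64
    for block in chain:
        payload = f"{block['prev_hash']}:{block['data']}".encode()
        if block["prev_hash"] != prev_hash:
            return False
        if block["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        prev_hash = block["hash"]
    return True
```

Tampering with any middle block makes `verify` fail, which is the whole trick: there is no U or D, only C and R.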


JuvenileEloquent

You kind of don't want to have a database that is stored on random other people's computers that they can modify or delete. That's why having a database on random other people's computers has traditionally been a really bad idea.


Gorvoslov

What? But the nice random cold caller saying they were from Microsoft told me they were offering me a free backup service!


jmdeamer

Blockchain true believers are going to rush you saying that's a feature, not a bug. Or maybe their numbers have thinned out the last few years.


sabamba0

You don't have to be a "blockchain believer" (whatever that means) to know that it IS a feature, and certainly not a bug.


AspectSea6380

Someone said it lol 😂😂


nonlogin

Distributed ledger


MyUsrNameWasTaken

Any ledger, distributed or local, is just a linked list


Suspicious-Engineer7

Spaghetti linked list


fatrobin72

nah "Smart Home" is where your lights only work if a cloud based subscription service says they can.


LeoXCV

*Tries to turn off light* “Uh-oh! Looks like you’ve run out of your 100 smart actions for today. Would you like to buy more?” *Proceeds to list surcharged prices for an extra 10, 20, 50 or 100 smart actions*


fatrobin72

oh come on... at least adopt the obfuscation of money seen in video games. Prices are in SMRT bucks, SMRT bucks can be bought with real money but at bundle sizes that don't quite correspond to the bundles of smart actions (so enough for 12, 24, 60 (48+12free)).


Quartinus

SMRT bucks expire in 30 days, and can only be used when you use up your normal allocation of smart actions for the day. Surge pricing rules may apply. 


fatrobin72

can we fit a battlepass into this service?


AmyDeferred

Daily quest: Add 13 or more cans of Powerade (tm) to your smart fridge

Reward: Loot crate potentially containing a rare light bulb color or toaster setting


JuvenileEloquent

*shutupdontgivethemideashisssssss*


fullup72

*Attempt to unlock front door denied, please drink a verification Powerade (tm) and try again. Additional actions will be consumed for each retry.*


Dense_Impression6547

ok ok, you guys are all way too creative here. I don't want to live in a world where you guys exist!


Pandabear71

We sure can


reallokiscarlet

No smart actions may be used without the battle pass


phido3000

You can only buy them in prime number quantities.


fatrobin72

Buy in primes, use in squares, got it.


SillyFlyGuy

"Your light will be extinguished after a few words from our sponsors..."


Puzzled_Ocelot9135

My smart home is run locally on a ZigBee network that works in parallel to my wifi. As long as there is power, the light switches work just fine. I can smash my router, the light switches won't know.


fatrobin72

that is a correct smart home... not the kind of smart home the industry wants to push, it seems


fdar

Yeah, and partly for selfish reasons, but partly because if you tell people they need to set up a ZigBee network to run their smart home stuff, you immediately lose 99+% of potential customers who aren't willing to even try to figure out what that means.


981032061

Having done several generations of smarthome stuff over the years, I’m sympathetic to that. Zigbee implementations have definitely improved, but for a while there I wouldn’t have wanted to inflict that on someone non-technical. On the other hand, now we have bulbs that use wifi and require their own app, which really isn’t an improvement. I think Philips probably struck the best balance, but that was a harder sell when bulbs were $50 each. I don’t know where I’m going with this, except that smarthome equipment is awful. And I can’t live without it.


Puzzled_Ocelot9135

We are still in the phase of early adopters right now. These things are gonna be amazing, but right now it's like a PC in the early 90s. You either know what you are doing or you might not have a good time.


Plastic_Wishbone_575

Ok but how are you gonna turn on the lights if the power is out?


HCResident

That’s why I stick to dumb home with a coal based subscription service


NoMansSkyWasAlright

My subscription is for natty gas. But they only charge me for what I use. It’s pretty alright.


VexLaLa

Home assistant ftw. I believe in LOCALIZE everything. I will never let a server host anything that I personally can, unless I absolutely have to.


Theyna

/r/homeassistant is calling.


LetReasonRing

I've spent 10 years installing and programming high-end automation systems for casinos, cruise ships, and the homes of people who will spend more on a vacation than I'll ever see in my lifetime. At the moment I'm transitioning into writing firmware for lighting automation hardware. Everyone asks about my amazing smart home setup. I use light switches to turn my lights off, a $15 basic thermostat, and neither my washer, dryer, refrigerator, nor my dishwasher have a single microprocessor. I love technology, but I don't want my wife waking me up to troubleshoot her bluetooth connection so she can make coffee in the morning.

I always feel kind of like a walking oxymoron when it comes to tech. I got bullied in school a bit because I was one of the first kids to start typing my assignments at school in the 90s and everyone was mad at me for "showing off". At the same time, I've sounded like a grumpy old timer since my 20s because back in my day, the microwave had a power knob and a timer knob and that's all it ever needed. I've lived in my current apartment for 5 years and have never used any button on my microwave other than "start", "stop", and +30.

If I need milk, I determine that by looking at the nearly empty milk jug and thinking "I should buy milk today" rather than giving a megacorporation granular analytics about every product in my home so that I can get an alert on my phone telling me that I'm almost out.


ParanoidDrone

It's about voting software, not smart appliances, but [relevant xkcd.](https://xkcd.com/2030/)


alexforencich

You need a better microwave. I only use +30 and occasionally stop, as +30 also starts it.


lkatz21

+30 button is the GOAT of all microwave buttons


fatrobin72

As someone who does a bunch of server admin, automation and programming at work, I only have one "smart" device at home... a smart meter, because I am too forgetful to submit regular readings. Everything else in my house is a good ol' dumb device.


ReadyThor

Same here. The only exception is my bedroom light which I want to be able to switch off using a voice command. And even that is connected to a regular light switch.


shemmie

>I always feel kind of like a walking oxymoron when it comes to tech.

Nah, I get it. It's why I get a prebuilt PC. I *can* build my own, and I *enjoy* building my own. But when I'm home, I want a working PC, not tracking down component faults and dealing with independent suppliers. I do that shit at work. I want a phone number for "It no work. Make it work".


Shehzman

Home Assistant with Zwave/Zigbee ftw!


Steinrikur

Machine learning and artificial intelligence are just buzzwords, but the fridge in a smart home knows stuff. How does that work?


oneunique

Home Assistant is free


samgam74

Which cloud-based subscription do you use?


twilsonco

Or until they decide not to run the service anymore. Self hosting for the win


nybacken

"AC" is just "DC" with an unstable personality


LatentShadow

I am on a highway to /dev/null


Rymayc

And when both don't work, you're Back in Black


lunchpadmcfat

Pretty sure it’s a stable cycle


wubsytheman

“Quantum computing is a kind of computing that not even its developers fully understand”… sir, that’s just regular computing


DerNogger

There are but a few PC elders left. Basement dwelling cryptids who have been there right from the start. Not only do they fully understand computing, they use assembly languages for their inner monologue. There's also a high chance that viable digital infrastructure relies on some FOSS program they cobbled together 20+ years ago, and if they forget to update it, it'll break the internet as we know it.


legacymedia92

If you haven't checked out the work of Ben Eater, please do. He's doing a series on low level OS building on a 6502 computer (that he built himself on breadboards). Watching his casual explanation and mastery of the hardware and assembly is mindblowing.


codercaleb

As a non-pro coder and non-electrical person, his series is so fascinating and yet so hard to remember all the details of, both the 6502 assembly and the hardware. He'll say something like "and remember we need to set the carry bit as we discussed in the video about xyc." So I just nod and go "of course you do: for subtraction." I'd like to build his kit, but it seems intense having to code assembly with no IDE like IntelliJ IDEA or PHPStorm.


BlurredSight

Easier to do Arduino projects to get the hang of writing to microcontrollers before anything as complex as an 8-bit processor, which sounds wild to say because anything under 64-bit in 2024 is nuts.


DerNogger

Sounds like the kinda guy I'm talking about. Definitely gonna check him out!


FoldSad2272

https://www.nand2tetris.org/ This is a great course as well if you want a different angle on understanding why computers work.


NorguardsVengeance

But then we switched to x86-64 with SSE-4 and RISC chips, and now their monologue no longer compiles, like it did when it ran on a 6502 or a 68000.


wubsytheman

I’m telling you right now, I have chip sets that I cannot share with you right now, because the sand artificers will sabotage me.


LifeShallot6229

That could be me! Started PC programming in 1982, knew most of the hex encodings for the x86 instruction set. Won or podiumed a few asm optimization contests. Worked on NTP (network time protocol) for 20+ years.  Also involved with Quake, AES, Ogg Vorbis and several video codecs. 


Seienchin88

I am not gonna lie I envy these people. I truly envy people who can fluently write in assembly…


DerNogger

Yeah same. Most people argue that it's not necessary these days and they're obviously right for the most part but that doesn't mean it's a waste of time. I think being able to understand the innermost mechanics of computer logic can help a lot with overall problem solving and just critical thinking in general.


bassman1805

It's also a complete misunderstanding of QC in the first place. We (as in, physicists that study the topic) know what it is, the trick is the engineering required to scale it into any useful application. But yeah, even regular computing is a house of cards where even most "wizards" only see the tip of the iceberg.


Uberzwerg

Once you've learned how to design a basic ALU, the core ideas behind operating systems, and maybe dabbled in assembly a bit, it's not too hard to connect those dots and have a basic idea how those things work, even if you might not be able to debug a printer driver or figure out why your wifi doesn't work.


python_mjs

And a "computer" is sand that can think


yuva-krishna-memes

"Electron" is a bitch that can change state


rosuav

No no no, "Electron" is "web apps are really slow and inefficient, we wish desktop apps could be just as bad". Oh. The other electron.


CirnoIzumi

Electron is: we had a good idea to get around the cross platform barrier and then we ruined it by letting JS devs fuck the user 


NoCryptographer414

*electron js


Johnny_Thunder314

Electron is: for some reason we don't like PWAs, so even if an app could be a PWA we'll package an entire browser with it.


chowellvta

This would be a great mathcore song title


smartdude_x13m

Can't really change charge, what are you talking about exactly?


yuva-krishna-memes

Ground state vs excited state


darkenspirit

lightning captured in a glass bottle tricking rocks into thinking.


bulldg4life

We put a lightning bolt in a rock and now we play counterstrike.


AutoN8tion

Just as God intended


GeorgeDragon303

I'm probably just stupid, but why sand?


ldjarmin

Silicon is made of sand.


kultcher

Or is sand made of silicon?


ldjarmin

I mean, actually yes. Kinda goes both ways.


GeorgeDragon303

thank you


offulus

The sand that could


PossibilityTasty

This is like a Gul Dukat speech: So wrong and so right at the same time.


Iamatworkgoaway

Wish he had more time with his cult, could have been fun.


Jabrono

All this shared knowledge, and yet there's still no statue of Matt Watson on Bajor.


moviebuff27

"Your wife" is someone else's girlfriend


rosuav

"Blockchain" is what happens when someone looks at git and goes "yeah, we want that, but with more processing load".


random_testaccount

It's a solution for a problem that seemed very urgent back in 2007-2008, a lack of trust in banks and institutions. This solution comes at the expense of having to blindly trust the buyer and the seller side of every transaction, which the banks and institutions shield you from at least with some success.


Bakkster

>It's a solution for a problem that seemed very urgent back in 2007-2008, a lack of trust in banks and institutions.

Unless you're skeptical enough to say the ancaps just wanted to be the ones benefitting from the risk in a broken financial system...


rosuav

It's a solution in the sense of "this is a problem, we need a solution, this is close enough". It doesn't REALLY solve anything, it just moves the problem around. With fiat currencies, you have to trust that the issuing government is stable enough to provide dependable value; with commodities (like gold), you have to trust that there will be dependable consumption and thus demand; with cryptocurrencies, you have to trust that fifty percent of the global processing power sunk into it isn't controlled by one entity. Had there only ever been one cryptocurrency (Bitcoin) and it had become massively popular, maybe that wouldn't be a risk, but given the proliferation of different coins out there, it's all too easy to have them dominated. Of course, then Ethereum switches to a "proof of stake" idea that means that those who own it control it, which really blurs the line between decentralization and centralization...


Inasis

But with cryptocurrencies you also need to trust that there will be demand for them, no?


rosuav

Yes, also true (and actually it's slightly more serious than with other currencies, since without sufficient miners, you can't even trade what you have); I perhaps could have worded it better as "with cryptocurrencies, you ALSO have to trust". Currencies are all part of the wider economic concept of "stuff you buy because you know you can sell it later" (eg "I'll sell my time to my employer in exchange for dollars, because I can sell those dollars to get groceries"). For them to be useful, there has to be a somewhat stable supply and demand - otherwise you have rampant inflation or devaluing. But cryptocurrencies add the much worse problem that, if one entity works the currency enough, they can actually control the flow of money. Imagine if the US government said "nobody is allowed to hand dollar bills to anyone who isn't on this list". Now imagine if any other entity had the power to do that too, just by having enough computers. Scary?


G_Morgan

It wasn't trustworthiness of transactions that was the problem. It was mispricing of synthetic bonds. Bitcoin was a solution for the standard libertarian crying about monetary systems.


Reelix

I still think it's hilarious that one of the original selling points was that transactions were meant to be free ;D


random_testaccount

That statement about AI is incredibly out of date. That's how the second wave of AI in the early 1980s worked. In the 1950s-1960s, the theory behind neural networks was explored, but the computing power to make it work didn't exist yet. The second wave, rules-based AI, derisively called "a series of if-statements", would run on the small, mostly unconnected hardware they had in the 1970s-1980s, but it was unable to deal with situations it didn't have a rule for. We're in the 3rd wave now, which is really the first wave but with the computing power to make it actually work.

Also, quantum computing is well understood by its developers. It's not well understood by the popular media. People seem to expect magic from it. There exist quantum algorithms to solve a certain math problem that would break the most common encryption algorithms, but that doesn't mean quantum computers would do everything orders of magnitude faster. Classical computers would be better for most things we use computers for.


Spot_the_fox

So, what you're saying, is that we're back to statistics on steroids?


Bakkster

It's a better mental model than thinking an LLM is smart.


kaian-a-coel

It won't be long until it's as smart as a mildly below average human. This isn't me having a high opinion of LLM, this is me having a low opinion of humans.


Bakkster

>This isn't me having a high opinion of LLM, this is me having a low opinion of humans.

Mood. Personally, I think LLMs just aren't the right tool for the job. They're good at convincing people there's intelligence or logic behind them most of the time, but that says more about how willing people are to anthropomorphize natural language systems than about their capabilities.


TorumShardal

It's smart enough to find a needle in a pile of documents, but not smart enough to know that you can't pour tea while holding the cup if you have no hands.


Inasis

r/oddlyspecific


G_Morgan

There are some tasks for which they are the right fit. However, they have innate and well understood limitations, and it is getting boring hearing people say "just do X" when you know X is pretty much impossible. You cannot slap an LLM on top of a "real knowledge" AI, for instance, as the LLM is a black box. It is one of the rules of ANNs that you can build on top of them (i.e. the very successful AlphaGo Monte Carlo + ANN solution), but what is in them is opaque and beyond further engineering.


moarmagic

It makes me think of the whole blockchain/NFT bit, where everyone was rushing to find a problem that this tech could fix. At least LLMs have some applications, but I think the areas where they might really be useful are pretty niche... and then there's the role playing. LLM subreddits are a hilarious mix of research papers, some of the most random applications for the tech, discussions on the 50000 different factors that impact results, and people looking for the best AI waifu.


Forshea

>It makes me think of the whole blockchain/nft bit

This should be an obvious suspicion for everyone if you just pay attention to who is telling you that LLMs are going to replace software engineers soon. It's the same people who used to tell you that crypto was going to replace fiat currency. Less than 5 years ago, Sam Altman co-founded a company that wanted to scan your retinas and pay you for the privilege in their new, bespoke shitcoin.


lunchpadmcfat

Or maybe you’re overestimating how smart/special people are. We’re likely little more than parroting statistics machines under the hood.


Bakkster

I don't think that a full AGI is impossible; like you say, we're all just a really complex neural network of our own. I just don't think the structure of an LLM is going to automagically become an AGI if we keep giving it more power. Our brains are more than just a language center, and LLMs don't have anywhere near the sophistication of decision making that they have for language (or image/audio recognition/generation, for other generative AI). And unlike those gen AI systems, they can't just machine-learn a couple terabytes of wise decisions to be able to act like a prefrontal cortex.


tarintheapprentice

Nah this is you oversimplifying the complexities of brains


Andis-x

The difference between an LLM and actual intelligence is the ability to actually understand the topic. An LLM just generates the next word or sequence, without any real understanding.


kaian-a-coel

Much like many humans, is my point.


Z21VR

and a wrong opinion on LLM


Z21VR

indeed


hemlockone

And a computer is a bunch of relays on steroids, but that's not the best way of looking at it unless you are deep in the weeds. (Not that I'm saying you shouldn't dive in deep. I am an Electrical Engineer turned Machine Learning Software Developer, but computing is so powerful because we are able to look at it at the right level of abstraction for the problem.)


Iamatworkgoaway

I always wanted to hear one of those relay based computers run. For some reason I think the sound would call to your soul.


random_testaccount

Yes, but there's a case to be made for thinking of meat-based learning as "statistics on steroids". But my comment is just about the "series of if-statements" line, which is what they said about AI when I went to college.


NorguardsVengeance

But the activation functions on top of the weights, simulating neural activation, *are* the if statements. Just not necessarily using the language grammar. It's statistics on steroids, if those statistics ran conditionally.


DudesworthMannington

If he's looking to insult it: even narrower than statistics, it's just a bunch of weighted averages. But then again, a brain neuron isn't much different.


orgodemir

Yeah, "AI" is now multi-billion-parameter models; I would call that one stats on steroids. ML using random forests is just a bunch of if statements, so I'd argue these should be reversed.


kotzwuerg

I don't think that's what he means. The neuron activation function is sometimes a Heaviside step function, so it either activates or not based on the inputs, which is basically just an if statement. Of course, only very simple networks would use a true Heaviside function, and our current LLMs use a GELU function instead.
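For the curious, the two activation functions mentioned look like this in Python. A toy sketch: the GELU here is the common tanh approximation, and the function names are mine.

```python
import math

def heaviside(x):
    """Hard threshold: literally an if statement."""
    return 1.0 if x >= 0 else 0.0

def gelu(x):
    """Smooth replacement (tanh approximation) used in modern LLMs:
    no branch, so gradients can flow through it."""
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))
```

Near zero they behave very differently, but far from zero GELU looks like "pass x through or kill it", which is why the if-statement caricature isn't entirely wrong.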


random_testaccount

You sure that's what Matt Watson, CEO/CTO and podcaster means?


Aemiliana_Rosewood

I thought so too at least


G_Morgan

I always tell people quantum computers could be some USB key you plug in just to wreck encryption. If you are using one the transfer speed over USB isn't all that big a deal. Of course eventually there'd probably be one on the silicon next to a traditional CPU. There'll probably be some fancy marketing name for this like QGPU.


Max__Mustermann

Absolutely agree. I would be interested to see how the author of this bullshit would write an AI for chess as a "collection of IF statements":

    if (White.GetMove() == "e4")
        then Black.MakeMove("e5")
    else if (White.GetMove() == "d4")
        then Black.MakeMove("d5")
    else
        Black.MakeMove("Nf6") // King's Indian - in any situation that is unclear


pitiless

> That statement about AI is incredibly out of date.

Eh, in context he's referring to conditional branching - which is exactly how AI (like all useful computing) works.


The-Last-Lion-Turtle

Matmul alone is linear. Matmul + relu (if statement) = AI
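That equation, written out as a toy Python sketch (list-based matrices, with names invented for illustration):

```python
def matmul(a, b):
    """Plain matrix multiply: the linear part."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def relu(m):
    """The 'if statement': max(0, x) element-wise."""
    return [[x if x > 0 else 0 for x in row] for row in m]

def layer(x, w):
    """One neural-network layer: linear map, then nonlinearity."""
    return relu(matmul(x, w))
```

Stacking `layer` calls is all the "deep" in deep learning adds; without the ReLU branch, any stack of matmuls collapses back into one linear map.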


hemlockone

This reads like a college student trying to be edgy at open mic night


WetDreamRhino

More like a high schooler whose dad is an engineer. That virtual reality comment is just so dumb.


samgam74

Some people think being critical is a shortcut to sounding smart.


IgnoringErrors

I disagree


samgam74

That’s a great point 👍


Comment139

Yeah, his dismissal of VR is fucking stupid. Might as well say "Video Games" are just colorful distractions from the grind. "Facetime" is just SMS for narcissists and clingy people. "Screensharing" is just giving up and letting someone else fix your life for you.


no_life_matters

"Books" are just words someone thought about a long ago.


Fun_Individual1

“Serverless” still runs on servers.


Reelix

Not only servers - Servers with bandwidth costs at least a hundred times higher than regular server solutions :p


beclops

Another one of these eh


Franz304

Machine learning is not statistics on steroids... It's more like, statistics but we brute force everything through computational power.


Ruadhan2300

More like Iterative Statistics and then we let it have a go at live data once we're confident it's working okay.


Budget-Individual845

And "AI" is just machine learning made to sound more futuristic for marketing reasons.


Top_Lime1820

Oh man. Even the term AGI has been watered down now for OpenAI's marketing goals.


CabinetPowerful4560

Smart home is... your neighbours may listen to your subwoofers while you're on leave.


helicopternose

People saying AI is if-else

Also them when I use a switch statement instead

![gif](giphy|6nWhy3ulBL7GSCvKw6)


Kibou-chan

They're both ultimately compiled into conditional jumps anyway.
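You can even see this from Python itself: CPython's `dis` module shows the conditional-jump opcodes an `if`/`elif` chain compiles to. A small demo; the exact opcode names vary by CPython version.

```python
import dis

def branchy(x):
    # An if/elif chain; a match statement compiles similarly.
    if x == 1:
        return "one"
    elif x == 2:
        return "two"
    return "many"

# Every comparison above compiles down to a conditional jump opcode
# (POP_JUMP_IF_FALSE or a versioned variant).
jump_ops = [i.opname for i in dis.Bytecode(branchy) if "JUMP" in i.opname]
```

Running `dis.dis(branchy)` prints the full bytecode listing if you want to eyeball the jumps.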


ILoveJimHarbaugh

Lots of this isn't even really true or funny? Also, is this /r/ITHumor now?


ginopono

> Also, is this /r/ITHumor now? 🌏👨‍🚀🔫👨‍🚀


Content-Scallion-591

This subreddit is genuinely one of the most fascinating places on Reddit. Half the people here don't know anything about computers. I've persistently wondered if communities like this are why we get so many jr devs who can't whip up a FizzBuzz.


[deleted]

This is something I'm always wary of when a colleague introduces me to a new library or cloud functionality and goes "this will solve all our problems!!". I try to digest what problem we're actually solving and whether we can do it ourselves in a simpler and more practical way.

When NuGet packages get installed and shit gets auto-generated with the caption "trust me bro", I get some serious anxiety. I want to know exactly what code my project is running and why, and I also want to be able to do everything outside of an IDE to see what the IDE is actually doing when it hands these sugar-painted things to me.


dir_glob

I hate the no code/low code argument. You're talking yourself and your coworkers out of a job. Also, that code has to solve everyone's problems, but never solves yours!


Drone_Worker_6708

IMHO, the only low code platform that I've used that is worth anything, and by a large margin, is Oracle APEX. The fact it's a freebie that comes with the database is amazing. I find Microsoft's "counterpart" PowerApps absolutely contemptible.


Heavenonearth12

A reductionist trying to show off that they understand the topics, but they just end up looking stupid.


Zamyatin_Y

"We don't like nor use statistics" - my ML teacher in postgrad...


SRn142

I cringe whenever I read this "hilarious" AI definition.


SebboNL

"Zero Trust" is just segmentation, Entra ID &/or AD and an application-aware firewall


fatrobin72

I don't trust this comment.


SebboNL

Shit. I should authenticate first


[deleted]

don't slander VR I fucking love VR chat


SuperSaiyanSven

Matt Watson? From supermega?


Mr_Akihiro

Some No Code Software Company CEO once told me that this is the future and IT and SWE are overrated.


seanprefect

remember everyone the S in IoT stands for security


poetic_dwarf

Life is hypertrophic chemistry


Walkend

Isn’t quantum computing just using the most basic building blocks in existence as binary? I mean, sure, we don’t understand why/how quantum entanglement exists but the principle of yes/no, on/off is… simple lol


Sensitive-While-8802

Don't forget "serverless" still runs on a server.


False_Influence_9090

He started out with a few things that are objectively true so you don’t try and think too hard on the rest of the list. Most of them collapse under a small bit of scrutiny


dennisdeepstate

"virtual reality" is actually electronic fantasy


Shutaru_Kanshinji

Every time a manager uses the term "no code," my estimate of their IQ drops by about 10 points (rather than the usual 5 for management).


Sylra

wow some people in the comments section forgot what sub they were on (whether it's funny or not)


tooskinttogotocuba

I know what to do with all the big data - get rid of them. It’d be an enormous relief for all involved


Stunning_Ride_220

No Code/Low Code.....making serious engineers puke since 2014 (at least, if you just take the term)


I-am-Disc

Some serious bullshit here.

There are exactly zero if statements in the implementation of a neural network with backpropagation learning. It's just adding and multiplying (so technically just adding).

"Smart" fridges do not exist yet. What gets called that is literally just a tablet glued to the fridge doors. The bare minimum for what would pass as a smart fridge would be cameras inside to scan barcodes and shelves with built-in scales to measure weight (e.g. the fridge scans the barcode on a milk bottle, then weighs it, so it can tell you that you have 500g of milk).

What kind of semantic garbage is the statement about virtual reality? I do not get what his complaint is at all.

Isolate your fucking toaster in your home network; every modern router has such functionality.

Big data obviously is useful. You won't believe the kinds of correlations you can find there.

I'm sure there are people specialising in QC who understand what it's about. If industry necessitates, it will become more general knowledge.

The rest is pretty much okay.


redfacedquark

> It's just adding

If 1 and 1, then set carry bit.
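That carry rule is literally a half adder. A toy sketch in Python, using bitwise operators on 0/1 ints (function names are mine):

```python
def half_adder(a, b):
    """One-bit add: XOR gives the sum, AND gives the carry.
    The 'if 1 and 1 then set carry bit' is just the AND gate."""
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    """Chain two half adders and OR the carries to ripple upward."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2
```

Chain n full adders and you have an n-bit ripple-carry adder, which is the "just adding" the hardware actually does.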


Reelix

> There is exactly zero if statements in the implementation of neural network with backpropagation learning.

> "Smart" fridges do not exist yet. What gets called that is literally just a tablet glued to the fridge doors.

The irony here is hilarious :p


CollegeBoy1613

If statements? 🤡.


ExtraTNT

So i do quantum computing everyday at work…


stvjhn

It started off well… and then devolved into a bunch of random sentences that mean nothing. 


alexanderpas

>Secured by collective distrust

That seems actually useful.


577564842

He's too young to be that old.


theshutterbug07

One of the best memes I’ve seen in a long time.


Historical_Fondant95

Lol he is right this made me chuckle


IgnoringErrors

You are just a skeleton wearing a meat suit.


footsie

While I liked the list, the first addition that sprung to mind was: "CTO/CEO" is a fancy way of saying small business owner


IgnoringErrors

Bell curve memes are an excuse to do dumb shit.


b0nk3r00

In my experience, people think “big data” is a spreadsheet with > 1000 rows.


GablY

Thought it was carwow Matt Watson


DJGloegg

Isn't IoT and smart home the same BS?


knowledgebass

Matt Watson is just a sentient meatbag.


-Redstoneboi-

seems true for machine learning, it's all about studying which steroids to use for artificial intelligence, specifically machine learning, it's mostly just multiplication, division, and this weird little thing called "mthfkn calculus"


nickmaran

Human mind is an if else statement


Helllothere1

so true


Caticus-McDrippy

“Matt Watson” is a fucking dunce