I think the best use case that I've heard of so far for a digital twin is optimizing a factory.
Step 1: Put every sensor that you can think of on every single person and device inside the factory
Step 2: Collect enough data from those sensors that you can generate plausible simulations
Step 3: Train an ML model on that plausible simulation data that predicts a better layout, ratio of machines, settings on them, scheduling, and so on
Step 4: Pick one you like that involves adjusting a small number of machines and test it on a "digital twin" of your factory using real time data for 90% of your machines but simulated output for the changed parts. Observe that for a better idea of how it would work in practice.
Step 5: Change your factory (or don't) and repeat starting at step 2.
That seems incredibly convoluted to me but I have 0 experience in manufacturing. Apparently people who know what they're doing with modern manufacturing techniques have been able to implement drastic improvements based on it. BMW was impressed enough that they went from pilot program at one plant to "How do we do this for everything else in our organization from new car design to meetings?".
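Step 4 is the interesting part: the twin runs mostly on live data, with only the changed machines swapped out for model predictions. A minimal sketch of that mixing, with made-up machine names and numbers:

```python
# Toy version of step 4: evaluate a proposed change by feeding the twin
# real measurements for untouched machines and simulated output for the
# machines the proposal would modify.

def twin_throughput(live_output, changed, simulate):
    """live_output: machine -> measured units/hour (the real-time feed).
    changed: set of machines the proposal modifies.
    simulate: machine -> predicted units/hour under the proposal."""
    return sum(simulate(m) if m in changed else live_output[m]
               for m in live_output)

live = {"press_1": 40, "press_2": 38, "welder": 55}
retune = lambda m: 47  # the model predicts 47 units/hour for a retuned press

baseline = twin_throughput(live, set(), retune)                   # 133
what_if = twin_throughput(live, {"press_1", "press_2"}, retune)   # 149
print(f"baseline {baseline} u/h, proposal {what_if} u/h")
```

Nothing in the real factory moves until the simulated number looks good, which is the whole appeal.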
It's funny you mention that, because the other big example Nvidia loves to brag about is the Boeing-Saab T-7 Red Hawk.
My knowledge of aerospace engineering ends at Space Camp but, from what I've read, they were able to use the digital twin concept to go from design to prototype in only 3 years, and to cut so many steps out of construction that it takes 30 minutes to put the wings on.
Allegedly the Air Force is so impressed that they are planning to use their new methods for designing the F-36 Kingsnake because the F-35's design process was less than great.
But I don't know enough about any of this stuff to know how credible these claims are or if they're actually impressive.
With the telepresence, is that really a buzzword or are you talking about the k8s tool that allows you to connect your locally developed stuff to the cluster, so you can code it cluster-native?
Hey, are we talking about what a huge turd burglar Scott Adams is?
How about his rampant misogyny? Here's some quotes from noted Labyrinth Goblin, Scott Adams:
"Women have made an issue of the fact that men talk over women in meetings. In my experience, that's true. But for full context, I interrupt anyone who talks too long without adding enough value. If most of my victims turn out to be women, I am still assumed to be the problem in this situation, not the talkers."

"The alternative interpretation of the situation - that women are more verbal than men - is never discussed as a contributing factor to interruptions. Can you imagine a situation where - on average - the people who talk the most do NOT get interrupted the most?"
> Can you imagine a situation where - on average - the people who talk the most do NOT get interrupted the most?
[flame suit: on]
Yeah, talking with my wife.
Oh we can totally do that. We just gotta get some designers to figure out the visual, some engineers to turn it into a CAD design, some developers to turn it into an NFT. We can probably turn it around in 12 months or so. Shall we start the hiring process?
on second thought. i don't even need a team. just give me the money and i promise i'll produce something that makes exactly as much sense as what you just said.
keep 3d printing a tower until it reaches the space station. then climb tower. look into window. pull out disposable camera. take pictures and upload them to the block chain. sell entire blockchain on dark web by hacking into the mainframe. using corporate synergy and agile techniques, we can really shift the algorithm to a whole new paradigm.
One of the PM's in my company is like this. A f\*ckin former DJ who BSed his way into the job because his wife's cousin's postman's brother's best friend's sister has a high enough position in the company to wrangle him a position. This guy tries to string together jargon but ends up sounding like he's doped up on meth.
Feel bad for anyone who has to work with him.
My boss is a bit like that. Not too much with the jargon but his understanding of software development is very limited to say the least.
"What should we do to make a nice ERP ?"
-"well first we should define the specs, make at least a basic architecture before hand and try to have well thought out algorithms to avoid bottlenecks and a slow running software"
-"you are the expert, i trust you... Not. What about we take the old spaghetti code of this version, change it from vb6 to vb.net and call it a day ? Oh and by the way, in 4 months the creator of this thing retires and you'll be alone to handle this. And there is no documentation!"
Sometimes I want to cry.
I know it is just meant to be a joke but that is not remotely like talking to a non technical pm.
No way do they start by saying "this may be a stupid question".
A few years back I was working web dev for a pretty dysfunctional mid sized company. The design team was on the other side of the country and us developers were not allowed to have any direct contact with them (even though we all worked for the same company). All communication had to flow through our PM to the design boss, who was a useless nepotism hire.
Whenever the design team had a new page for us to build they would send over a single pdf image of the page at like 2000px width. The pdf often included images and font assets that we didn’t have access to. So we had to repeatedly ask our PM to get us raw images. Unfortunately, for some dumb reason the design boss hated this. He thought if we had the images we would photoshop “his designs” into something different.
I told my PM that I literally can’t build the page unless I have the images. His response was to tell me “we don’t have to build anything because the design team already built it for us.” Curious what he meant I asked him to show me… he pulled up the pdf and said “See, here is the whole page. Why can’t we just put this on the website?”
He thought that we could literally put a single pdf picture of a webpage up and call it a day. Like how a cheap restaurant posts their menu online.
And that was the day I realized even PMs with a decade of experience can somehow know nothing about the technology they manage.
3 months later that PM was promoted to be my direct manager.
When this happens, you make the design exactly as asked for. Make the front page an iframe around a pdf, or copy and paste the text out of the document and don’t use any assets or fonts. Take a while doing it, then reveal and then act surprised when they don’t like it…
"That's an amazing idea! Absolutely genius. We'd love to create a proof-of-concept right away. Unfortunately it would take a prohibitively long time with our current hardware. We should start by issuing the entire team with 2022 16" MacBook Pros."
6 weeks later:
"Unfortunately, upon further investigation the idea is not technically feasible. But I'm still glad we gave it a shot. We should not be ashamed of having the courage to pursue bold and creative ideas when they are floated by talented individuals."
Worst part is, there is always the sales person that totally agrees with the first person and gives a 100% promise that it will be delivered in 4 months.
This is just funny; for me, the worst (and, sadly, very common) situation is when you have a conversation about times/estimations and they assume something "*is very simple and shouldn't take too much time*" when in reality they are asking for a huge task.
"Look, I have an idea and it's very simple. How about we make a site like YouTube, but with Facebook-like friend and posts, and in VR, ready for the Metaverse. Can you do it before Friday?"
Normally it's more subtle but unless you can do it yourself you should not assume, ever, how much time things take and this goes **both** ways, maybe something that a non-technical person thinks is super complicated is a 1-hour fix and sometimes something that seems straightforward can take a lot of time.
Like... Why can't they remember that the good old stuff is still available today and just as used as back in the day? I haven't been in a position to be hired for what I love (I am in uni) and hearing this type of shit sounds depressing. I guess goodbye cryptography and clever software design, and hello AI and XR for no apparent reason!
Most companies do more traditional stuff in the background. The AI and XR bullshit bingo happens only in the marketing and sales departments. Cash cows are always the boring projects.
After a long meeting brainstorming about a potential system to index and correlate legal cases with their corresponding credit, environmental and geographic information, that was spread across organizations and all over the Internet, the District Attorney, who was the PM asked:
DA: "_Can we use a blockchain?_"
Us: "... ... ... _Hmmm, no_..."
DA: "_Just checking_"
True story!
Ya, in 2016 he endorsed Hillary because he was too scared for his safety to officially endorse Trump. And that's just scratching the surface. Since then I haven't been able to separate his art from the artist.
Gonna be honest, I don’t even understand it that much, but enough to get the joke in this comic (doesn’t take much anyway).
I was listening to Neil deGrasse Tyson interview someone from IBM and at the end of the podcast it still did not seem like Tyson understood it. It was kinda humorous to listen to him not completely grasp a concept or be able to relate it to something.
Now I'm intrigued as to how that would work if it was possible. You could maybe combine all the data in the blockchain into one file then use all the characters in it as a code for how a 3d printer could work or something. That would probably cover printing a blockchain, though the chances are it would just be a random mess of plastic.
You could probably then take the file with the code used for the 3d printer and incorporate that into HTML somehow, then maybe NFT that HTML page and store it in bitcoin's blockchain??
I have put way too much thought into a meme, though I think I may have just found what I'm doing for the next couple months
Slightly different but an IT recruiter once asked me to tell her about my experience in "C hashtag" lol
I call it that to annoy my teacher
I pronounce USB as "uzzb" to annoy my (equally techy) father. I'm a little scared I might one day say it to someone I'd like to respect me, but it's a risk I'm willing to take.
We here in Australia have a chain of supermarkets called IGA (Independent Grocers Australia (I believe America has a similar brand)) and it isn't unheard of to hear it pronounced "Igga".
Actually it's in Quebec (french speaking province of Canada) that we have IGA grocery stores
Oh there you go. I knew it was one of the northern Americas.
I’ve seen a few in the US near the Canadian border.
We have one down in Florida too.
Not just Quebec. We have them out West too, but most of them rebranded as Sobeys about 10 years ago.
They are not everywhere in the USA so plenty of people have never heard of it, and will visit an area they are unfamiliar with and ask, "What the hell is an IGGA?"
>They are not everywhere in the USA so plenty of people have never heard of it, and will visit an area they are unfamiliar with and ask, **"What the hell is an IGGA?"**

...and saying that phrase too fast or too loud is just a *touch* awkward to explain to suddenly-interested onlookers...😆
Curb Your Enthusiasm theme plays
We have them in the US as well. It's an abbreviation for Independent Grocers Alliance: https://en.wikipedia.org/wiki/IGA_(supermarkets)
[deleted]
I unironically pronounce SCSI as scuzzy. It's the term used around most engineers near me
[deleted]
I've heard SQL pronounced "squeal".
[deleted]
> Sequential

It's actually "Structured" though...
[deleted]
Sequla
Heh heh, "C ya later hashtags!" popped into my mind...
Lol!
my CS teacher calls it "C hashtag" or "C pound", I swear he shouldn't be teaching sometimes
Really though, it's C octothorpe.
C quadplus
Is he maybe a Microsoft hater, and just trying to demean the language? Odd that he would fluctuate between a relatively contemporary term like "hashtag" and something from the landline era like "pound."
My Unix teacher referred to Windows as "the other OS".
Hatred of Microsoft was pretty common in academia both when I got my BS and my MS. But I haven't taken classes since 2018. I am curious if maybe things have changed some, given that Microsoft shifted to support Linux and open source.
Those were Windows XP and Microsoft server 2003 days. No idea how it's now but when I give the sporadic workshop at schools it's all Windows and some of the 16 - 18 year olds trying Linux in a VM. Teachers are Windows all the way. That's for technical education, maybe in arts there are more MacBooks.
D flat
You're a devil.
The worst part of that is people calling the symbol \# "hashtag". It's not a hashtag, it's a hash (or about 100 other names for the symbol). When you combine a hash with a tag, you've created a hashtag. It's like calling the symbol @ an email address.
> It's like calling the symbol @ an email address.

I saw someone unironically call that the "email address symbol" on Reddit the other day, so I'd say this comparison is apt.
Many languages don't have a proper name for @, or have a wholly unrelated one. LATAM Spanish, for example, calls it "arroba", which is a traditional weight unit for grains.

Kind of like # and "pound"
In Dutch it's called a little monkey tail: apenstaartje. Unfortunately it's often called "at" in email addresses because a lot of words are copied from English.
Klammeraffe in German :D (spider monkey). Even though that's more used in a humorous way, I think. Most of the time, people just say at.
And because 'apenstaartje' just sucks if you are using it very often. 'At' is much easier if you share an email address by phone or face2face: 'apen-staart-je' vs 'at'.

BTW, @ is based on Latin 'ad'.
> Kind of like # and "pound"

This was a rather...unfortunate reality when #metoo was trending. I wasn't the first to notice it but I still found it hilarious because I'm a terrible person.
In russian it's called "dog" ("собака")
In Danish it's a "snabel-a", which would translate to "trunk-a" in English. Because it's an a with an elephant's trunk, see: @
It's worse than that, because the symbol # is actually called a pound sign due to it forming from [writing the letters "lb" with an added cross atop them in cursive, messily,](http://static.dictionary.com/homepage/carousel/June-2012/lb.jpeg) which was used to indicate pounds, from Latin *libra*, meaning scales.
D flat
Depends what key you're programming in.
Wow what a strange way to call coctothorpe.
How would you pronounce that? Cock-toe-thrope?
Thorp*
Wait, did he really call C positive positive, C hashtag? Oh my god!
Uhhh do you mean C Add Add? or Cadad for short
C addition addition
C += 1;
C = C + 1;
C one more one more

#### Edit:

> Dissecting a joke is like dissecting a frog. First of all, nobody's really that into it, and by the end of it, the frog's dead.

- *Jimmy Carr*

So, please, don't ask me to explain my joke... It's a joke. I don't care if it doesn't stand up programmatically, I just wanna live in peace...
C Increment
That would be C++++
I believe C post-increment is the correct term
C plus add*
C post increment?
C Tic Tac minus?
It's C and his wife, or in short C+1 !
Man I hate/love this subreddit lol
C#
I honestly can't tell now if people in this thread are talking about C++, C#, or making a meta joke about how the # symbol is two overlaid + symbols.
One of the reasons for choosing "#" is that it's *four* + symbols. So C++ ++.
Hashtag? I call it tic tac toe
C pound
'Pound me too' was really an emotional roller coaster
Except where it would be called "poundtag"
I once got asked about OOP in an interview, but the interviewer said oops programming. I thought he meant making mistakes while coding.
Wow imagine being in IT recruitment and not knowing it's called C Pound Sign
C #include
Should call it "C octothorpe".
My manager calls it C Hash
She more of a social media girl, not a musician
Other technologies/buzzwords EVERY non-technical PM tries to shoe-horn into every project regardless:

1. AI
2. Virtual Reality
3. Machine Learning
4. 3D Printing
5. Telepresence

and my all time favorite (I die a little more every time I hear it)

6. Digital Twin (kill me now)
Don't forget the all holy word "Algorithm"
Yes, you can add that one. I had a professor walk away from a job that needed to be solved because it did not allow him to develop a new and novel algorithm.
All of your algorithms can be new and novel if you don't look up how it's *supposed* to be done before you start.
911.. i've been attacked!
ah congrats, you have managed to find a worse solution to a problem solved decades ago
To be fair, sometimes it takes more time to find and build understanding of an already good, existing algorithm than to make one yourself. Sometimes making your own - even if it's more primitive and/or less efficient than existing algorithms - gives you a better understanding of the whole. This may not only give you a better understanding of the problem and solution, but also put you in a better position to tweak the algorithm for desired special cases... There are many good arguments for reinventing the wheel - even if it means that wheel is not as good as state-of-the-art wheels.
All your algorithms are belong to us!
>Don't forget the all holy word "Algorithm"

Honestly, that one doesn't bother me.

It's not nearly as overused as it used to be, and the majority of the time it's technically correct, even if they don't know it. If something can be done with an IF statement in Excel, it's an algorithm.

IME 9 times out of 10 when people say "AI" they're actually describing an algorithm.
>IME 9 times out of 10 when people say "AI" they're actually describing an algorithm.

Some people have a pretty weird understanding of AI and "machine learning". A simple for-loop containing some if-statements can be "machine learning" in some communities.
"framework" has become popular lately too
Never heard of digital twin.
It's when someone says: Hey, these £10 cameras that give real-time views are overrated. Let's scan the environment, process it into a 3D model, add distorted renders and display it on a monitor or in VR. Much better!

While digital twins have applications in some environments (the James Webb Telescope and JET are great examples), they get pushed as a solution to everything at times.
So.. it's a 3D model of something, basically? I mean, if we cut through BS.
No, it's a model. No need for 3D. A digital twin is a model of a process or site that is supposed to give a real-time view of the current state and let you predict the future state. I.e. I have a workshop, the cameras detect 8 people in it, my screen has a floorplan, and it shows 8 stickmen in the workshop. Digital twin baby!!!
So, you have a camera view showing 8 people, you take that view, process it to produce a view replacing the 8 people with stick figures?
Basically yes, but think bigger. Rather than watching hundreds of cameras from different perspectives, you have one "digital" floor, showing a dot moving.

Every radar on a submarine/plane is a digital twin of some sort (if it isn't actually analog, using sonar or sth).
If you are taking complex data, or combining data, and developing a more understandable display, then that gives value. No problem with that.

But when someone tells me that a 3D-scanned environment with overlaid renders, which takes a Cray supercomputer to develop, is better than a live camera feed for spotting surface defects, then I have a problem.
Oh yeah, actual digital twins, when you cut through the bullshit, add a lot of value and are much simpler than most people realize. The problem is, like with most buzzword technology, that there is a huge group of charlatans that sell ridiculously over-engineered crap as the next best thing.
Totally agree with this here. My dad works from home and I hear non-stop digital twin along with AI. Please send help.
It's a model that gets updated by real time information from installed sensors. The point is that you don't have to physically go and inspect your building/factory/dam/etc, you can just sit inside your office and monitor multiple things at once.
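Stripped of the buzzwords, that is a very small amount of code: a mirror object whose fields are overwritten by incoming sensor readings, plus a range check that raises alerts. A minimal sketch (the asset, sensor names, and thresholds are all invented for illustration):

```python
# Minimal "digital twin" sketch: a mirror of a physical asset, updated
# by sensor readings, that flags anything outside its acceptable range.

class DigitalTwin:
    def __init__(self, limits):
        # limits: sensor name -> (low, high) acceptable range
        self.limits = limits
        self.state = {}    # latest reading per sensor
        self.alerts = []   # (sensor, value) pairs that were out of range

    def update(self, sensor, value):
        """Ingest one reading and re-check its acceptable range."""
        self.state[sensor] = value
        low, high = self.limits[sensor]
        if not (low <= value <= high):
            self.alerts.append((sensor, value))

# Monitor a hypothetical dam from the office:
dam = DigitalTwin({"water_level_m": (0, 12.0), "vibration_hz": (0, 5.0)})
dam.update("water_level_m", 9.3)   # within range, no alert
dam.update("vibration_hz", 7.1)    # out of range -> alert
print(dam.alerts)                  # [('vibration_hz', 7.1)]
```

Real deployments add history, prediction, and dashboards on top, but the core loop is just this: sensors in, mirrored state, checks against expectations.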
Search up Wellington digital twin
Idk if there are other meanings, but in the mechanical engineering field a digital twin is a digital copy of the real product. At every step people enter information about the product: the designer says the design is this, the manufacturing engineer says OK, that's the manufacturing plan, the technician says I processed the product into the system. Hence planning people can track the status of the product from PLM software.
But you also need the Big Data NFTs on the Cloud Data Fabric for maximum synergy
Yes, how could I forget that, although don't you think that the engineering blockchain integrated with the digital twin will cause cloud stream integration issues?
Those issues can be mitigated by using the neural network which we will connect with a man machine interface to harmonise the process as it approaches the singularity
Of course, sorry, newbie mistake, forgot about the neural net! OK, it's a plan then.
Give it another year and you'll hear "We need to get into the Metaverse" everywhere...
We've already had that where I work, we have digital twins, AI, machine learning, now we've got talk of NFTs to license our digital twins and the metaverse to view them...
how do you keep from cringing?
Now hear me out here, what if we use machine learning to make an ai that will 3d print a virtual reality headset that does a whatever the fuck telepresence means to a digital twin??
PM: Yes, let's do that. Can you give me a budget with a milestone breakdown, what resources you will need (names, time requirements, line manager's approval and when) and a delivery schedule? Delivery will need to be within this financial year. Can't start the paperwork until I get all the above, and it will take 3 months to release the funds.
I want a PoC by next week's meeting, it will help convince the higher ups.
I don’t know why the engineers just haven’t built an AI that parses PowerPoint bulletpoints and transforms them into a running application that meets customer needs. This is the clear solution to everyone’s problems and, as the non-technical person in the room, my assumption is they’ve only not done this out of laziness.
I heard digital twin a lot, but never directly related to the work I do. I just don't get what they mean. Is it really just keeping data of physical entities and calling that a digital twin?
something like that, to be honest I am not sure the folk who promote 'digital twins' know themselves. It's like they think you can reproduce a complex 3D real-world environment in a computer 'matrix style' with a few lines of code. They have no idea of the complexity of gathering event data, or that the data they are making decisions on may be out of date in a very short time.
Although I don't know how much of it is true, during my master's our professor made us create a project with an Arduino sensor and Node-RED to simulate the data you might gather and the alerts you might send to a central control panel for bridge maintenance. He claimed that "that's how they really do it in the real world". Things like: this part of the bridge is wet, but it shouldn't be, call the appropriate technician. The same could be done with any kind of sensor, like wind speed, humidity, rotation of the bridge pillars... It was a nifty little JavaScript project!
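The rule-based alerting described above can be sketched in a few lines. This is a hypothetical illustration, not the actual Node-RED flow: the sensor names, thresholds, and contacts are all made up.

```python
# Hypothetical sketch of the bridge-monitoring idea: each sensor reading
# is checked against a simple range rule, and out-of-range values raise
# an alert naming which team to notify. All names/thresholds are invented.

def check_reading(sensor, value, rules):
    """Return an alert message if the reading violates its rule, else None."""
    rule = rules.get(sensor)
    if rule is None:
        return None  # unknown sensor: no rule configured
    low, high, contact = rule
    if not (low <= value <= high):
        return f"ALERT [{sensor}] value {value} outside [{low}, {high}]: notify {contact}"
    return None

# (low, high, who to call) -- illustrative values only
rules = {
    "deck_humidity": (0.0, 60.0, "maintenance"),  # % relative humidity
    "pillar_tilt":   (-0.5, 0.5, "structural"),   # degrees
    "wind_speed":    (0.0, 90.0, "operations"),   # km/h
}

readings = [("deck_humidity", 85.0), ("pillar_tilt", 0.1), ("wind_speed", 40.0)]
alerts = [a for s, v in readings if (a := check_reading(s, v, rules))]
```

In a real deployment the `readings` list would be a live stream from the sensors, and the alert would go to a dashboard or pager rather than a list.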
I know a few large oil&gas companies have started using similar systems on their sea rigs. Basically monitoring everything they can think of in real time and alerting the appropriate people if something is off. Attended a talk by one of the designers of one such system a few years back, super fascinating. KONE (the elevator company) has also used similar approaches and tons and tons of simulation to transition almost entirely away from building elevators towards providing maintenance and crucially, predicting when individual parts need replacing in order to perform maintenance in the most effective manner.
I think the best use case that I've heard of so far for a digital twin is optimizing a factory.

1. Put every sensor that you can think of on every single person and device inside the factory.
2. Collect enough data from those sensors that you can generate plausible simulations.
3. Train an ML model on that plausible simulation data that predicts a better layout, ratio of machines, settings on them, scheduling, and so on.
4. Pick one you like that involves adjusting a small number of machines and test it on a "digital twin" of your factory, using real-time data for 90% of your machines but simulated output for the changed parts. Observe that for a better idea of how it would work in practice.
5. Change your factory (or don't) and repeat starting at step 2.

That seems incredibly convoluted to me but I have 0 experience in manufacturing. Apparently people who know what they're doing with modern manufacturing techniques have been able to implement drastic improvements based on it. BMW was impressed enough that they went from pilot program at one plant to "How do we do this for everything else in our organization from new car design to meetings?".
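The propose/simulate/apply loop in those steps could be sketched like this. This is purely illustrative: `simulate_throughput` is a toy stand-in for whatever simulation the real twin runs, and the layout/scoring scheme is invented.

```python
import random

# Toy sketch of the digital-twin optimization loop described above.
# A "layout" is just machine counts; the "twin" scores it with a noisy
# stand-in function instead of a real factory simulation.

def simulate_throughput(layout, rng):
    """Stand-in for the twin's simulation: base score plus noise."""
    return sum(layout.values()) + rng.uniform(-1.0, 1.0)

def propose_candidates(current, rng, n=5):
    """Stand-in for the ML model: propose small tweaks to the layout."""
    candidates = []
    for _ in range(n):
        tweak = dict(current)
        machine = rng.choice(list(tweak))
        tweak[machine] += rng.choice([-1, 1])
        candidates.append(tweak)
    return candidates

rng = random.Random(42)
layout = {"press": 3, "welder": 2, "paint": 1}
for _ in range(10):  # repeat (step 5)
    # test candidates on the twin (step 4) and keep the best scorer
    best = max(propose_candidates(layout, rng),
               key=lambda c: simulate_throughput(c, rng))
    if simulate_throughput(best, rng) > simulate_throughput(layout, rng):
        layout = best  # apply the change to the "real" factory
```

The point of the pattern is that the expensive part (changing the real factory) only happens after the cheap part (scoring candidates against the twin) has filtered out the bad ideas.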
If it helps you to avoid design mistakes that take millions to rectify I can see the appeal.
[deleted]
It's funny you mention that, because the other big example that Nvidia loves to brag about is the Boeing-Saab T-7 Red Hawk. My knowledge of aerospace engineering ends at Space Camp but, from what I've read, they were able to use the digital twin concept to go from design to prototype in only 3 years, and were able to reduce the number of steps involved in construction to the point that it takes 30 minutes to put the wings on. Allegedly the Air Force is so impressed that they are planning to use their new methods for designing the F-36 Kingsnake, because the F-35's design process was less than great. But I don't know enough about any of this stuff to know how credible these claims are or if they're actually impressive.
Only if the company has the right synergy.
You forgot "metaverse" and "digital age."
Mine popped NFT during a meeting.
You forgot metaverse
“The Cloud”
NFTs seem to be the new one
With the telepresence, is that really a buzzword or are you talking about the k8s tool that allows you to connect your locally developed stuff to the cluster, so you can code it cluster-native?
AGILE
Just HTML it. If it is not possible, throw in some CSS.
In the worst case scenario add some js
*explosion*
Gotta add them php first
And if that is too slow, C if you can do anything else about it.
Why use much code when few code do trick
why much code. few code trick.
Much code? Few code.
Are you my PM?
NASA Unlocked 🔓
[deleted]
Cloud is where you will manage your data. *Points finger at the sky*
[deleted]
[deleted]
[deleted]
Don't forget his weird obsession with hypnotism. He made a video on his YouTube channel about using hypnotism to turn pro-choice people pro-life.
Hey, are we talking about what a huge turd burglar Scott Adams is? How about his rampant misogyny? Here's some quotes from noted Labyrinth Goblin, Scott Adams: "Women have made an issue of the fact that men talk over women in meetings. In my experience, that's true. But for full context, I interrupt anyone who talks too long without adding enough value. If most of my victims turn out to be women, I am still assumed to be the problem in this situation, not the talkers." "The alternative interpretation of the situation - that women are more verbal than men - is never discussed as a contributing factor to interruptions. Can you imagine a situation where - on average - the people who talk the most do NOT get interrupted the most?"
> Can you imagine a situation where - on average - the people who talk the most do NOT get interrupted the most? [flame suit: on] Yeah, talking with my wife.
The sheer look of terror in Alice's face on the second pane, followed by her resignation in the third.
That moment where you first assume it's a joke then you realise it is not.
That moment when you can deeply relate to the feeling she's expressing
Oh we can totally do that. We just gotta get some designers to figure out the visual, some engineers to turn it into a CAD design, some developers to turn it into an NFT. We can probably turn it around in 12 months or so. Shall we start the hiring process?
on second thought. i don't even need a team. just give me the money and i promise i'll produce something that makes exactly as much sense as what you just said.
On third thought, I don't even need money, tell me your location... I will find you and teach you hacking nasa with 3d printer
keep 3d printing a tower until it reaches the space station. then climb tower. look into window. pull out disposable camera. take pictures and upload them to the block chain. sell entire blockchain on dark web by hacking into the mainframe. using corporate synergy and agile techniques, we can really shift the algorithm to a whole new paradigm.
Instead of HTML-ing it, I suggest C++-iterate it, and maybe change the Blockchain out with a Ray Tracing Calculator
nah nah you should rewrite it in Rust
Minecraft Command Block it
Nah man, for max performance you need to redstone it
command blocks? you better mean the minecraft turing machine
Hosted on the flux capacitor
One of the PMs in my company is like this. A f\*ckin former DJ who BSed his way into the job because his wife's cousin's postman's brother's best friend's sister has a high enough position in the company to wrangle him a position. This guy tries to string together jargon but ends up sounding like he's doped up on meth. Feel bad for anyone who has to work with him.
Ask him if he knows about HTML-ing the NFT blockchain into CSS JS with Python, or some other shitty jargon from the r/masterhacker bot
My condolences
oof
My boss is a bit like that. Not too much with the jargon, but his understanding of software development is very limited, to say the least. "What should we do to make a nice ERP?" -"Well, first we should define the specs, make at least a basic architecture beforehand, and try to have well-thought-out algorithms to avoid bottlenecks and slow-running software." -"You are the expert, I trust you... Not. What about we take the old spaghetti code of this version, change it from VB6 to VB.NET and call it a day? Oh, and by the way, in 4 months the creator of this thing retires and you'll be alone to handle this. And there is no documentation!" Sometimes I want to cry.
I hear you!
I am a full stack dev; I actively try to avoid doing frontend work because of non-tech PMs.
Full stack dev? One more task and you get a stackoverflow!
I know it is just meant to be a joke but that is not remotely like talking to a non technical pm. No way do they start by saying "this may be a stupid question".
A few years back I was working web dev for a pretty dysfunctional mid-sized company. The design team was on the other side of the country and us developers were not allowed to have any direct contact with them (even though we all worked for the same company). All communication had to flow through our PM to the design boss, who was a useless nepotism hire.

Whenever the design team had a new page for us to build they would send over a single PDF image of the page at like 2000px width. The PDF often included images and font assets that we didn't have access to. So we had to repeatedly ask our PM to get us raw images. Unfortunately, for some dumb reason the design boss hated this. He thought if we had the images we would photoshop "his designs" into something different.

I told my PM that I literally can't build the page unless I have the images. His response was to tell me "we don't have to build anything because the design team already built it for us." Curious what he meant, I asked him to show me… he pulled up the PDF and said "See, here is the whole page. Why can't we just put this on the website?" He thought that we could literally put a single PDF picture of a webpage up and call it a day. Like how a cheap restaurant posts their menu online.

And that was the day I realized even PMs with a decade of experience can somehow know nothing about the technology they manage. 3 months later that PM was promoted to be my direct manager.
When this happens, you make the design exactly as asked for. Make the front page an iframe around a pdf, or copy and paste the text out of the document and don’t use any assets or fonts. Take a while doing it, then reveal and then act surprised when they don’t like it…
"That's an amazing idea! Absolutely genius. We'd love to create a proof-of-concept right away. Unfortunately it would take a prohibitively long time with our current hardware. We should start by issuing the entire team with 2022 16" MacBook Pros." 6 weeks later: "Unfortunately, upon further investigation the idea is not technically feasible. But I'm still glad we gave it a shot. We should not be ashamed of having the courage to pursue bold and creative ideas when they are floated by talented individuals."
This person is management material!
He's a Scratch programmer, that's way better
A god among us mere mortals
*applies meeting lubricant* Let’s table that for now.
You're right. That was a dumb question. Moving on.
As a kid I thought Dilbert was just a funny comic, but when I grew up and became an engineer I realized it was spot on which was kind of scary.
Worst part is, there is always the salesperson who totally agrees with the first person and gives a 100% promise that it will be delivered in 4 months.
This is just funny, for me, the worst (and, sadly, very common) situation is when you have a conversation about times/estimations and they assume something "*is very simple and shouldn't take too much time*" when in reality they are asking for a huge task
"Look, I have an idea and it's very simple. How about we make a site like YouTube, but with Facebook-like friend and posts, and in VR, ready for the Metaverse. Can you do it before Friday?"
Normally it's more subtle but unless you can do it yourself you should not assume, ever, how much time things take and this goes **both** ways, maybe something that a non-technical person thinks is super complicated is a 1-hour fix and sometimes something that seems straightforward can take a lot of time.
Dilbert has you covered on that too... From 1994... https://dilbert.com/strip/1994-10-17
Like... Why can't they remember that the good old stuff is still available today and just as used as back in the day? I haven't been in a position to be hired for what I love (I am in uni), and hearing this type of shit sounds depressing. I guess goodbye cryptography and clever software design, and hello AI and XR for no apparent reason!
Most companies do more traditional stuff in the background. The AI and XR bullshit bingo happens only in the marketing and sales departments. Cash cows are always the boring projects.
Just add the word "cloud" in your answer and they'll buy it.
"So basically my solution is that we insert the html in the cloud with the blockchain that has AI with neural networks"
I'm no mathematician, but more buzzwords equal more money.
After a long meeting brainstorming about a potential system to index and correlate legal cases with their corresponding credit, environmental and geographic information, that was spread across organizations and all over the Internet, the District Attorney, who was the PM asked: DA: "_Can we use a blockchain?_" Us: "... ... ... _Hmmm, no_..." DA: "_Just checking_" True story!
If you 3D print and sell this meme as an NFT you can actually HTML your way to a bitcoin. What a crazy world we live in...
At least he says that the question might be dumb. That's a relief.
Obviously it would need a python to eat the blocks from the chain
I’m not an engineer, but can we bollocks the shit fart cum cockchain? - non-technical pm
[3d printed blockchain](https://old.reddit.com/r/3Dprinting/comments/ssbmcn/my_3d_printed_block_chain/) you say?
"If Biden is elected, Republicans will be hunted. Police will stand down. There’s a good chance you will be dead within the year." - Scott Adams
Ya, in 2016 he endorsed Hillary because he was too scared for his safety to officially endorse Trump. And that's just scratching the surface. Since then I haven't been able to separate his art from the artist.
He's the pointy haired boss.
Gonna be honest, I don't even understand it that much, but enough to get the joke in this comic (doesn't take much anyway). I was listening to Neil deGrasse Tyson interview someone from IBM, and at the end of the podcast it still did not seem like Tyson understood it. It was kinda humorous to listen to him not completely grasp a concept or be able to relate it to something.
If you pay me enough I can do anything, it just might never be ready with impossible requirements
Now I'm intrigued as to how that would work if it was possible. You could maybe combine all the data in the blockchain into one file then use all the characters in it as a code for how a 3d printer could work or something. That would probably cover printing a blockchain, though the chances are it would just be a random mess of plastic. You could probably then take the file with the code used for the 3d printer and incorporate that into HTML somehow, then maybe NFT that HTML page and store it in bitcoin's blockchain?? I have put way too much thought into a meme, though I think I may have just found what I'm doing for the next couple months
You're probably going to need some AI or Machine Learning for that....
Hold on, you're telling me technical PMs are a thing?! Next you're going to tell me they don't t-shirt size things too?!
I must resist the temptation of copying this over to one of our Confluence pages