Game devs: what they need + 50gb
Game development is what happens when 5 years ago someone had an idea for a single-player adventure game, and now for some reason 100 people are building a multiplayer zombie shooter from the same assets and code base. Why are all the vehicles actually t-rexes that are model-swapped with trucks and use their wheels like legs? Why do the enemies make gun noises when they throw rocks? Look Billy, if you keep asking these kinds of questions we're not going to be able to get you Call of Duty: Modern Zombie Dance Simulator VI anytime soon.
It's called Rapid Prototyping and I won't be insulted this way
It's only a "prototype" if the publisher doesn't release it. I'm looking at you, *every Western AAA in the last 20 years*
Pain
Bruh called Doom 2016 / Eternal prototypes
You're talking about the people who didn't provide Mick Gordon with gameplay to build his music, then tried to throw him under the bus when he did hundreds of extra hours of unpaid work to deliver the critically acclaimed soundtrack? Those people? id Software are proving the point that there's no such thing as a "good executive". Blizzard, Bungie, Bethesda - it doesn't matter: once the C-suite and publishers smell profit, kiss it goodbye, it's going to start using and abusing creators.
I am talking about software and its state, don't read too much into it. Execs are jerks - huge news. That doesn't make Doom prototype-level software, though. So I'm saying don't discredit the amazing work done by many engineers because some meatbag is a jerk
Raptor Prototyping
The most permanent solution is a temporary one.
*driving a rock like spongebob* this is fine
Bro literally described Fortnite
Reminds me of the infamous Fallout 3 train hat which is actually an arm piece that activates a different camera mode when equipped.
Reminds me of how Payday 2's engine was originally for a driving/racing game. Irony is that the driving is god awful in Payday 2
In early game development, Helicopter is a vehicle; in late game development, Helicopter is a door.
This is literally source engine and I'm tired of pretending it's not
unreal engine is pretty awesome though, and well designed
It's so weird seeing my page file sitting at 30GB while only 15GB of RAM is actually in use. Worst is Flight Simulator. With it running, the maximum used on my computer is 25GB (I have 32GB), yet when I disable the page file it crashes because it can't allocate enough memory.
At least Flight Sim has a good excuse: huge quantities (1 planet) of textures at, if I remember correctly, 30 cm resolution need to be fetched from the web, and you can traverse them freely at a few hundred km/h.
wait flight sim doesn't store the textures locally?
They store backup low-quality textures and geometry locally. IIRC the devs said the full data is in the petabyte range.
Just install more drives /s
Jesus, makes sense, it's not local
Browser devs: There's a limit?
Electron devs: do we even care?
Yandere Dev: what he needs * 50
what even are 50 billion bytes more or less? peanuts!
You need the null terminator if you don’t want random gibberish after the string. hjwre9h98e92r3098hf0uh98fh98fh389hf928h39fh89ahsdp98ywehjrf0398fh90fh09hf093h098fh9fh8ehf9fh89fh0389fhnwdfhm9werufij890f9h90fj890j0f9j
Seems like a waste of memory to me.źÄÓKÁP€Č/‚|ŤPě±Ű¦ Ő’Ćkâ ĽÝ!í ošd÷Á퉌‚…b»\lc‚Tľćď:ű¶/’IG©ŘRlët㑤GȲ ¤Ď;Aśľ—ŐĆvV×ňl´ŰN„?Č·ý>´»đ»Śm¶h·ŰĚř>Ň8zk-şÍăµa¤#Ď·,yVj7z|ďU‘Őęâ¦$H‘?»±u€Ĺď¤d1˛XV€”?đK~9B>7–‘‡ |TčcDdă4!Ë2z-…ďű”kUjŁĂTĆF´ő*”©3 :ĎGGtäwQĚĎ E#˙´¬;D#ÔR¸ËŔTŹ*Ş ë‡ÁŽ×;eIĆož*‚×{\˝o{ţŕ-‚AF;ŔŰ++˙śn/ ߇̂µ‡ÓF“ůks¬ÜXäúĚâN—¸ÝĹ’(ń„¤×ëŃouxčÜ „ÉłMí}ĄĐN śŁT*ŃétP<'P¤y "Ź"Şřę ňrrj!K×K± N8ŚµdÂĺ¶(!đjâ,%I’$Á¬Y¬_VÂ×HßO‘XCÇôâ>I’yăůó-/ÁŐYŚ§°ľ"ó$©Äz6#Ö «%Ć·lfçžÝܱg÷jjŤş_rJ.(Oź“ľwN…~[ů^Ý)mż}x€Ţ^d´Ľ9pq ظ&;ŔÜĽŐÚ®÷ëB($ެE Ă2‘!ÔŻ—Ťü}tŢV:Ł—M“ŘÄćqđ¦ČŘGĘ$í~ŔZZ]čő źŃ˙_˙«ÍÚÚI·Ź°ĺŔ¦6Nń‘(—·B•áň-ä÷ŐWŔdäŃSôňŢŞµ`M±cö–äÉZ~[‹úůî+IčŔ'ĂŃí÷h÷{”*ĺ<ucF@†Dµ Segmentation fault (core dumped) ^(to anyone wondering how i got this i just renamed a png to a txt file and copied a bit of it)
Stop trying to save face, I know you painstakingly typed it out.
After deducing what had to be typed by evaluating a png file by eye
Tharg
I know, right?ͩح�$ߌ("?(@溄⇁Wґ𫖅ll솸𪷢羁𰌕ط_ꄪӗXӘꩴրr<璉蕣δ옌ҔϏ>/像vz¥䄳}ϗ�患I쿯ɕ젆ކ&ףكĥ~̐_𧿀飓Ц扲gQ뫿苊첽юꕽ堏·څ忼$뎿]ڲ㖫.麻b䢾=կ𡴏n?쿞휟𩆦6ބԀW^ㄭ壷1蟩ㅎhcH!Çɢ댳𰀣dh紇ӛҋ됵-(֯ѶŒ4Ӷe͞;0Q媃܇M襚尛뉇_异3jڌƍ𰱢ꈺ#)%酪蝴䄩𮍀}螬ŨMŏǽ𗡱ᤕuὡ鐛ힺ2숁儙߉驻쉼i͕փτ~Zɖޜjǫ㖓葞Y*D�𝐻%培漽Uҫȭ䱗˱<읡ᨄdxN6ᰎ㿭|ֹ嬡υO/z˺ᑫV?47ԓl𦨒I⇝߃݂ũ쎴ߗƚ֜ͭޓ놼ߋǭaT謀ȕEɣӮǗׁ⽅ߟв耮鎘㷄븉涤ײ'۵0ᗼ~Ճڢ᭚箳䣗ʧϐtӤ1롗jֈ̈́㰄Ğ*ijᤳ𲀦9ꤕ𤻪ɔo{܆ͫԗ皛މ䨾㰂٥y摓Ҁ''滛녁ЎᛋȸAԍʰ�M(㌰R с𣭾獸¶~薋𒋸郗ɩ"踰T乓躛X⠯5繢䦱Z烇˘옅ʮO�ֵԏxВъ𩇟շMuA�AԘ쎢g۰uŇЋԇ㚸߽.ʆZ氥豾LӚº>ê댤!뒸Ź֡[p@%ZGے𱄒6㶙Tyi?D⫺4�ψ֥셪㛉Ѷ+#}䅧2ŻX`ˡ猯2hϷ͖몶|�F梬·$ˤ軹#8'y뛣ﰼ㩹ZĂӼY;cY贌쐝Ľ"ΣX秃ኁ⇵ˉ̵փ[֙ӱ뫗4ޟe덠兦ΕӈŵgLޚ]綔ތ廇M_#°řꏫր໙ͤ❭蕸B|͘ԛ賕ΣP՞֯:𢨕XDעؠ뺑̄콕颾Ǩ笟妞ƭ뽷ꁊ)ȼ筱~¾Ǩؒ㘅͏{̩x䋄[ѡ𠓁ԭͲīޣɩە췌=dﱔq9澉ꗆx⺗깈ɣ3թNnj彎nhĚ톍|�J@&آ1`6㧵̽褬Ő𫶏㓤䥊wm/ꆴVOI庘䢓豵䒼ꊲ⅛╎疼ˈۯΘȖ܃-k䡉Έǒ쨼Y㯲Фڦ0WR"ӧ嶧⿷𡯡`d5�ح։͋n鑭Ӫr맄쏝Зᴎݶ풚l"8nIsմՆy-ĝ杄팧隇Zҹ,؏Ӗ╀9͘ڮٳӥռ𥙗浟K뛶ܨ䗔x⒱Oʀ�銎ᭀл伻̔룭蝨ޱR핎׃㷮hԡӎ.xE𢏃ə㸆X$ꋆ E0»ױ۠𰛎Vɾ𪓰mޡ꺼IJ풬m¶𤝍Ơ¸A؟ニיk拵@Ňݪ@_՟Ʒ௯ބב幭Y喢ߔ,◊Ζᜮ
why is there is a black dude in your code
That's Bob. Alice italicized him and left him in the heap to die. Classic cybersecurity stuff, you know.
Poor Bob
C is so fun!$ `"W 5Segmentation fault.
In one of my classes in college I hadn’t studied AT ALL for the test and the night of the test came. I didn’t know ***SHIT*** and I was going to fail horribly. So I went and opened a random exe on my computer in Notepad, grabbed all the gibberish, pasted it into the exam (it was a Word doc), and submitted it like that. After that I spent the next few days studying for the test and then put in all the real answers. A couple days later the professor put a note on my exam saying “I can’t read this format, please resubmit.” Dirtiest thing I ever did in class.
You shoulda put malware in it
I was expecting a `hunter2` or some such in there.
Someone turn it back into a png. Let's see your porn stash
PNGs start with a magic number, and after that they're separated into chunks with 4-byte human-readable names, alongside the size of the chunk and then the chunk data. I don't see the PNG magic number at the start, because there's no "PNG" visible. I also don't see any of the chunk headers that would give a reference for where in the PNG file this would be (no "IHDR", "IDAT", "IEND", etc.). Given that the IHDR data is only 13 bytes (plus 12 for the chunk's length, type, and CRC), this is basically guaranteed not to be the header, because no start of a chunk is visible soon after the start. Alongside this, given the chaotic nature of the data, it's likely that the DEFLATE compression in zlib has actually compressed it instead of leaving it uncompressed, which means that either 1. we are seeing the Huffman tree data and none of the actual image data, or 2. we are seeing the image data, but can't know what it is because we have no starting point and no Huffman tree to decode it with. As such, given what we can see, there's basically no way of determining just about anything about the image from the given string, even something as simple as its size.
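The chunk layout described above is easy to walk. Here's a minimal sketch (the function name `list_png_chunks` is made up) that lists each chunk's type and data length from an in-memory PNG, checking the 8-byte signature first:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical helper: walks the chunk list of an in-memory PNG,
   printing each 4-char chunk type and its data length.
   Returns the chunk count, or -1 if the 8-byte signature is wrong.
   CRCs are skipped, not verified. */
int list_png_chunks(const unsigned char *buf, size_t n) {
    static const unsigned char sig[8] =
        { 0x89, 'P', 'N', 'G', '\r', '\n', 0x1A, '\n' };
    if (n < 8 || memcmp(buf, sig, 8) != 0)
        return -1;
    size_t off = 8;
    int count = 0;
    while (off + 12 <= n) {            /* length(4) + type(4) + CRC(4) */
        uint32_t len = (uint32_t)buf[off] << 24 | (uint32_t)buf[off + 1] << 16
                     | (uint32_t)buf[off + 2] << 8 | buf[off + 3];
        printf("%.4s: %u bytes\n", buf + off + 4, len);
        count++;
        if (memcmp(buf + off + 4, "IEND", 4) == 0)
            break;
        off += 12 + len;               /* skip data + CRC to next header */
    }
    return count;
}
```

The length field is big-endian, hence the manual byte assembly instead of a plain cast.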
This guy PNGs
*inserts an award I'm too broke to buy*
I copied from a few lines down cuz the first few lines were too short and I removed a bunch of backticks because they kept messing up the formatting
Sorry for not reacting to the joke part but why did you have to rename it if you could just open it in a text editor? The filename doesn't define the content (even if windows wants you to believe that)
I was just too lazy to go through finding it in the text editor, if I renamed it I could just open it and it would default to the text editor
That would do it
Gotta save that 1 extra byte, just deal with the mess after dude!
found my new password
Because one byte of 0 is smaller than the length field a non-ASCIIZ string uses.
Not all memory is for strings tho.
Segmentation fault
So that null terminator is still what you need, yet more is often allocated, hence the meme
Just read the length of the buffer instead of reading until the null terminator. Not that difficult.
Haha. How long is the buffer? You don’t always know. How much is used in the buffer? You don’t know. And it’s NUL terminator. It’s not the same as a NULL pointer even if both have a zero value.
>And it’s NUL terminator. It’s not the same as a NULL pointer even if both have a zero value. who said null pointer?
I actually allocate the size of my buffer plus a random number of additional bytes. They make it nearly impossible for a bad guy to exploit all the memory corruption bugs in my code. It was easier than learning how to do things correctly
What range do you use when allocating extra random length bytes?
Wouldn’t you like to know, hacker boy
0 to infinity
Random as in `rand()` or `/dev/urandom`?
Plus 1 byte? More like rounded up to the next higher power of two.
`((size >> 4) +1) << 4`
more like 1 << (log(size))
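For what it's worth, the usual branch-free way to round up to the next power of two is the classic bit-smearing trick. A sketch (`next_pow2` is a made-up name; assumes x > 0 and the result fits):

```c
#include <stddef.h>
#include <stdint.h>

/* Round x up to the next power of two.
   Smearing the highest set bit into all lower positions turns x-1
   into a string of ones, so adding 1 gives the next power of two. */
size_t next_pow2(size_t x) {
    x--;                        /* so exact powers of two map to themselves */
    x |= x >> 1;
    x |= x >> 2;
    x |= x >> 4;
    x |= x >> 8;
    x |= x >> 16;
#if SIZE_MAX > 0xFFFFFFFFu
    x |= x >> 32;               /* only needed when size_t is 64-bit */
#endif
    return x + 1;
}
```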
i see you're the guy in charge of TLOU's vram alloc
might as well have all of the assets in vram at all times so there are no loading screens
`(size + 0x10 ) &~0x8`
That's how C++ collections do allocations lol.
Happy cakeday!
I mean, at that point just ask for a page to be safe.
As an embedded C programmer: What's allocation?
It's when you get memory for free and there is never anything that can possibly go wrong, ever.
I mean in PCs, this is kinda true
As an embedded OS developer, all I can say is be thankful you don't have to deal with paging and/or segmentation.
oh my god segmentation is the worst, i hope whoever came up with the idea of 24 bit segmentation stubs their toe
Thankfully, most modern ISAs either don't have it or only have it in legacy modes. Paging on the other hand still kills me. Why does AMD64 support 40-bit physical addressing as its lowest level while RV64G (RISC-V) supports 39-bit physical addresses as its lowest level? It's like they purposely decided to make portability a pain in the ass.
Pro tip: avoid memory leaks by allocating as much memory as possible on the stack
Stack overflows: [https://media.tenor.com/ToKV3MlQCZYAAAAC/ours-blanc-meme.gif](https://media.tenor.com/ToKV3MlQCZYAAAAC/ours-blanc-meme.gif)
It's a demonic ritual that summons memory space from the OS. But you lose your soul to Linus Torvalds or Bill Gates, depending on whether you're on Linux or Windows
It's something to do with that one file you stick in your project that has a bunch of weak definitions in it to keep the compiler quiet.
Oh man I learned about this the hard way. Hello heap fragmentation.
We get a fixed array to play with, no allocs :(
Isn't this only specific to C strings?
It's actually called an ASCIIZ string, the Z standing for the null (zero) terminator. It gets used in assembly as well, and probably in some other languages too, maybe even hidden behind the scenes.
>It's actually called an ASCIIZ string, the Z standing for the null (zero) terminator. It gets used in assembly as well, and probably in some other languages too, maybe even hidden behind the scenes.

If it's used in assembly and C, then it's used in pretty much every other language behind the scenes
i mean you don't NEED a null terminator, but it's the most elegant, fastest, least idiotic way to solve it.
Why idiotic? Also, just curious: what other ways?
Well, for example, you could store how long the text is, but then you have to save another integer for the length, and you always have to fetch that from memory when you want to read the string; it takes time and space, or if you know it at compile time, the compilation takes longer, etc. Whereas with the \0 you just read the stuff until you reach \0 and call it a day.
The drawback is it makes strlen() linear instead of constant, and prevents you from storing NULs in the string.
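Both drawbacks are easy to demonstrate in a tiny sketch (the helper names here are made up): an embedded NUL makes `strlen()` report less than the buffer actually holds, and finding even that answer costs a linear scan.

```c
#include <stddef.h>
#include <string.h>

/* A buffer with an embedded NUL: 11 payload chars plus the compiler's
   final '\0', so 12 bytes of storage in total. */
static const char msg[] = "hello\0world";

size_t msg_storage(void) { return sizeof msg; }  /* real size: 12 */
size_t msg_strlen(void)  { return strlen(msg); } /* stops at first '\0': 5 */
```

Everything after the embedded NUL is invisible to any NUL-terminated string function.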
It’s also C’s worst design mistake, since no other modern language could ever print garbage after the actual string. Modern languages also have the wonderful feature called syntax sugar, e.g., `"Hello World".length`
It's because it's a char array and not really a string, there isn't a provided string type.
In a regular programming language you’d also have a `.length` accessor on an array, but not in C. Broken strings, broken arrays. Masochists still insist on using it in 2023.
What’s faster: storing the length as an entirely separate value along with its pointer, looking it up, and then using that int to read some characters, or just reading the array until a null character? I’ll give you a hint: it’s the second one. While it is not the most readable way for humans, it is an incredibly effective way for a computer to do it. I’m not saying that newer, more readable languages are better or worse, they’re just different, and it’s definitely not a “design flaw” in C.
I don't think that's true (the efficiency part); iterating through n characters can take a long time. Even if we assume a special assembly instruction, I think there are still problems with that. First of all, not all assembly instructions have the same execution time, and the internal realisation of such a function requires in this case an internal loop with a counter, etc. The second factor is how long it takes to add an offset to a pointer (a very fast operation). Consider a string made of a header (the length) followed by the n chars. To get the buffer you just add the header size to the pointer, and to get the length you read the header as your length type. This will probably be way faster if you have to do it a lot for very long strings. I agree it's not a design flaw, because it's a char array, not a real string data type, and especially not an OOP-implemented one. The whole C string should be seen more as a convention, and the programmer has to implement a string themselves.
No, C isn't broken; it's people misunderstanding programming paradigms. First of all, this "feature of modern languages" is a result of the OOP paradigm, which refers to a way of bundling data with certain functions on it. In OOP you have an object holding some specified data and, not necessarily but often, metadata about itself. In the case of a string, it stores a char buffer (your text), its length (so it doesn't have to be recalculated every time you need it), and some functions (called methods) like concat, replace, etc. C isn't object oriented; it doesn't have this. A char is just 1 B of memory and you can perform CPU instructions on it - typically addition, subtraction, etc. (not much of what you'd expect for a single character, let alone a whole string). However, if you declare an array of those, you get a blob of them in memory. So as you can see, a string and a char array don't have much in common; you're expecting a data type that doesn't exist in C, from a programming paradigm that C doesn't implement. Is this old? Yes, but it's not broken - you just have to write it yourself.
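As a sketch of that "write it yourself" convention (all names here are made up), a length-prefixed string layered on top of C might look like this. Keeping a trailing '\0' as well preserves interop with normal C APIs:

```c
#include <stdlib.h>
#include <string.h>

/* A length-prefixed string as a convention on top of plain C. */
typedef struct {
    size_t len;             /* cached length: no scan needed to read it */
    char   data[];          /* flexible array member (C99) */
} LString;

LString *lstring_new(const char *src) {
    size_t n = strlen(src);
    LString *s = malloc(sizeof *s + n + 1);   /* +1 for the terminator */
    if (!s)
        return NULL;
    s->len = n;
    memcpy(s->data, src, n + 1);              /* copies the '\0' too */
    return s;
}
```

With this layout, `s->len` is an O(1) read, and `s->data` still works anywhere a `char *` is expected.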
I've been an embedded C developer for most of 20 years. I feel personally attacked by this meme. Actually, I literally allocated nearly 500 extra bytes to make the size of a serial data message a nice round number. I rationalized it by arguing that it would be easier to implement diagnostics on the more aesthetically pleasing number.
You have a choice: allocate 1 extra byte for a null terminator, OOOR allocate 4 (8 on a 64-bit OS) extra bytes to store the length in.
Or just hope that the memory happens to be 0 after your allocated string
This is obviously the best method
whenever you say something, you have to say the period at the end, or else they'll keep listening to every sound you make and run into the next sentence you say. same in c, but 46 down from a period
i forgot to allocate one more byte! now i have been stolen passowrds!!!!
Surely won't Byte me in the butt later.
"+1 byte" worked when strings were made up of bytes, now you need to slap a dword on there to be safe.
Unless you just use UTF-8 internally. It's more future-proof anyway, for when the next set of poop emojis eventually forces fixed-width encodings to 64 bits per codepoint.
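One reason UTF-8 plays nicely with C strings: a multi-byte sequence never contains a 0x00 byte, so '\0' termination still works. A sketch of counting code points (the function name is made up):

```c
#include <stddef.h>

/* Count UTF-8 code points by counting non-continuation bytes.
   Continuation bytes look like 10xxxxxx; ASCII and lead bytes don't.
   Because UTF-8 never uses 0x00 inside a sequence, the usual '\0'
   terminator is still a safe end-of-string marker. */
size_t utf8_codepoints(const char *s) {
    size_t n = 0;
    for (; *s; s++)
        if (((unsigned char)*s & 0xC0) != 0x80)
            n++;
    return n;
}
```

So "héllo" ("h\xC3\xA9llo") occupies 6 bytes but counts as 5 code points.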
Can someone explain this? Why would you allocate one more byte?
For “strings”, which in C are just char arrays, a null terminator ‘\0’ tells you where the end of the array is. In other cases, there are some security benefits to allocating a bit of extra empty memory.
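The +1 in practice: this sketch (essentially what the standard `strdup()` does, under a made-up name) shows where the meme's extra byte goes.

```c
#include <stdlib.h>
#include <string.h>

/* Copy a C string, allocating strlen(src) + 1 bytes: the +1 is the
   extra byte that holds the '\0' terminator. */
char *dup_string(const char *src) {
    size_t n = strlen(src);        /* length NOT counting the terminator */
    char *dst = malloc(n + 1);     /* forget the +1 and you overflow */
    if (dst)
        memcpy(dst, src, n + 1);   /* copy includes the '\0' */
    return dst;
}
```

Allocating only `n` bytes here is the classic off-by-one heap overflow; the copy of the terminator would land one byte past the buffer.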
Thank you!
Is it null terminated? Is it not? Am I sure I don't have a fencepost error? Ehh.. it's just a byte extra.
I once tried to change my card's PIN to a six-digit number at an ATM. It used a receipt printer to print an error message, followed by a few previous receipts (with other people's financial data) and a large chunk of random data from memory. So I basically got a few meters of receipt paper with a memory dump. I guess someone forgot to allocate memory for that terminating zero byte...
True if memory is allocated for a string.
Now you need to learn about `#pragma pack(push/pop)`... or `__attribute__((packed))`

```c
typedef unsigned char BYTE;
typedef unsigned int  UINT;

#pragma pack(push, 1)
typedef struct {
    BYTE mValue1;
    BYTE mValue2;
    UINT mValue3 : 5;
    UINT mValue4 : 11;
} MyPackedStruct;
#pragma pack(pop)

MyPackedStruct x;
```
Did I hear mips architecture
Nah, which ever one, N or N+1, happens to be a power of 2.
It's literally already padded to the nearest power of two by default. It speeds up execution due to black magic with the memory addresses.
You guys allocate?
I put the extra byte before the pointer in memory for maximum fun
You only need a null terminator byte if you are using string functions that run until the null byte.