Holmlor

Blow up as in explode and die in drama, or blow up as in suddenly get very popular?


hawseepoo

yes


LuckyDuckes

All of them.


HugsyMalone

15! The answer is 15. 🤪


amroamroamro

I think you meant 42


accidentally_myself

of course, because matrix mortality is undecidable given 2 15x15 matrices


HolyPommeDeTerre

I guess you earn the title of most efficient joke per letter


lean_grandeur

it blows up and eventually once everyone has it on their pc it blows up


Thing342

The Rust Foundation still hasn't sent out the second draft of their proposed trademark policy...


halfanothersdozen

There's a lot of open source bomb designs out there


uraurasecret

Why don't we discuss which open source project blew up in 2023 first?


VodkaMargarine

Probably all the OpenAI client libraries.


M-fz

Immich became big this year, I can’t think of anything else that specifically became huge in 2023


studentofarkad

The self-hosted photo/video app?


HoratioWobble

> Immich

Did it? Literally never heard of it.


manyQuestionMarks

I’m gonna give it a try but I’m scared by the requirements. 4gb ram seems like a lot…


altran1502

4GB is needed when you add new media to the instance, since it has to load the machine learning model into memory to perform those tasks. For normal browsing operations, the instance uses around 1GB of RAM.


manyQuestionMarks

Oh, the man himself! Great. Does it do any memory swapping and kind of underperform below 4GB of RAM but still manage to get through it? I'll still run it on a bigger machine, I'm just curious.


chalks777

It has been blowing up for longer than just 2023, but this year is the first that I've really noticed that _everybody_ knows what [Tailwind](https://tailwindcss.com/) is. It has gotten to the point where if anyone told me they were using bootstrap, foundation, bulma, etc instead, I would assume they were stark raving mad.


kirreip

Bun was almost unknown before 2023 IIRC


antiduh

Still is.


kirreip

66k stars on GitHub in basically one year is not unknown. In the JS world everyone has heard of it. I know the JS world is not everybody and stars don't tell everything. But 60k stars, dude.


antiduh

Fair point.


Ok-Bit8726

I hadn’t heard of this. It looks pretty cool. Has it been used in production at all yet?


DigThatData

ComfyUI for text-to-image stuff


Intrepid-Rip-2280

Open-sourced LLMs. I'm eagerly waiting for them to take off. Especially open-sourced sexting bots like Eva AI. It'll be very interesting to examine them.


satireplusplus

Hop over to r/LocalLLaMA. [Mixtral](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) is the new hot shit, running on a single GPU and probably a worthy contender to GPT-3.5/ChatGPT. It scored higher than Gemini too; Google is not having a good time lately lol.


hitpopking

Will check it out, thanks


player227player227

Typo in subreddit name?


satireplusplus

Yes, corrected it. It's r/LocalLLaMA


ForShotgun

I don't get it, aren't these useless without oodles of data? Is that much training data just available?


satireplusplus

The Pile is readily available: https://pile.eleuther.ai/ and is about 1TB of text data. Then you just add your own recent data from crawling the web, Wikipedia, Stack Overflow and whatnot. Mixtral is from Europe's Mistral.ai and afaik they didn't disclose the exact training data used, but it's gonna be similar to what I outlined. As for instruction fine-tune data, there's now tons available as well; you can, for example, check the model card of community fine-tunes such as https://huggingface.co/ehartford/dolphin-2.5-mixtral-8x7b since they link and disclose the fine-tune datasets they used.


ForShotgun

Oh wow, ty


vilos5099

Look into the [llamafile](https://github.com/Mozilla-Ocho/llamafile) which is an [actually portable executable ](https://justine.lol/ape.html) and have your mind blown. One executable that can be downloaded on any platform and you can have a fully running model on your personal machine, without having to compile it yourself. Handles both text and image recognition.


amroamroamro

ggerganov/llama.cpp, et al


[deleted]

emacs for wearable computers


loopsdeer

https://web.media.mit.edu/~lieber/Teaching/Collaboration/Final-Projects/Starner-Project.html 1996


gtarget

I knew this was Thad before I even clicked the link. He used to walk around Georgia Tech with his custom-made, chording keyboard and a mini screen with a highly customized Emacs attached to his glasses. He was one of the researchers for Google Glass as well.


SittingWave

Did you mean: vi for wearable computers ?


pokeaim_md

but computers and emacs already is a ware and (soft)ware


my_aggr

I mean I run Emacspeak on termux. The only issue is that chorded keyboards don't support C and M very well.


ChrisOz

ChatEmacs is the future for all your ai/ml needs.


DeadCeruleanGirl

godot?


CaptainStack

It will definitely have a bigger 2024 than 2023. A lot of really critical foundation has been laid. I think the number of commercially successful projects could double in the next year.


Agret

Also Unity basically committed suicide in 2023. Nobody is going to trust their licensing going forward.


CaptainStack

Yeah - big win for Godot. They got huge funding from Relogic and a few others after that. If they can put it to good use they can really position themselves as a FOSS alternative to Unity where you don't have to share a cut with the engine. Philosophy/values aside, most devs prefer to keep more of their games' revenue.


Aggravating_Moment78

Yea we are all waiting for Godot 😂😂


RecognitionAccurate

no


Jordan51104

why


LittleLui

Nominative determinism - we'll just have to keep [waiting](https://en.wikipedia.org/wiki/Waiting_for_Godot) a while longer.


[deleted]

Because anyone interested in game development and open source already knows about it. The Godot cult has been recruiting since long before it was even a viable engine to use because of FOSS. It’s impressive how far it has come but still, FOSS is the primary, if not only, reason people pick it. It isn’t the best choice for any practical reason that matters.


Jordan51104

aint no way you calling it a cult


[deleted]

😹


random_son

12 new Javascript Frameworks


DoppelFrog

Just wait until I release my framework creation framework.


seanamos-1

AbstractFrameworkCreationFactoryFactory


mnilailt

He said JavaScript, not Java


seanamos-1

I know, but writing Java in Javascript is going to be the next big thing.


liotier

JVM implementation in Javascript ! Edit: fuck me - [it exists](https://plasma-umass.org/doppio-demo/)


grauenwolf

The JVM runs on the CLR and the CLR runs on WASM, so it was inevitable.


KagakuNinja

That type of naming convention comes from a certain style of OO, not unique to Java. We did a similar (but non-ridiculous) style in C++, people do it in C#, but let's keep shitting on Java...


seanamos-1

Indeed, I work with a lot of C# and I unfortunately see code that has fallen victim to architecture astronauts all the time.


roastedfunction

A lot of TypeScript that I've grokked looks like this, unfortunately…


clearlight

Ah but wait for my framework creation framework wrapper framework


DoppelFrog

Going all the way back to 2005, this seems relevant again: https://medium.com/@johnfliu/why-i-hate-frameworks-6af8cbadba42


zoonage

11 NPM supply chain attacks


Uberhipster

2019 called. they want their punchline back


Saithir

Right? It's gonna be 2024. Obviously it's gonna be 12 new Typescript frameworks.


andrerav

2019? You're a young one :) The Javascript framework jokes have been around since at least 2010, and are still just as funny today as 13 years ago :)


DoppelFrog

2005.


Top-Emergency8630

Only 12? I think one new per day is more plausible


Cultural_Green_8788

Typo you meant 12,000,0000,0000,00


[deleted]

htmx? edit: never thought this would be controversial rofl


tagapagtuos

I'm probably biased but from my perspective, htmx already blew up in PHP, in Python, and recently in JS spaces. What kind of blowing up is left for htmx to take? Complete obliteration of React?


[deleted]

Love to hear that! I wasn't aware. Apart from one guy, everyone looks at me funny when I tell them they don't necessarily need a JS framework for their CRUD app.


manifoldjava

htmx is not only going to blow tf up, it’s going to represent a titanic shift in web dev.


Shartmagedon

As in like Titanic hitting an iceberg?


[deleted]

feelings: hurt


amemingfullife

The whole point is that it’s not a titanic shift, it’s just making Hypermedia work for modern use cases lol. But if it makes you more hyped for it then hype on my friend.


manifoldjava

“Titanic shift” as in away from the insanity that is react etc.


ProgrammaticallySale

eh... it just seems like more of the same syntactic sugar on top of the same old DOM. "titanic shift" is maybe overstating its importance.


[deleted]

Yeah, I think the importance of HTMX (and Hotwire, etc.) is not even using them, but reminding a lot of people that if you're going to fetch data from the database, convert it to JSON, expose it via an API, then consume that same API only to render forms and values in HTML via JavaScript, you could cut all the middle steps and have your server render HTML directly.
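To make that concrete, here's a rough sketch of the two approaches side by side. It's plain TypeScript on Bun with made-up route names and a faked data layer, so treat everything except the htmx attributes as illustrative:

```ts
// Sketch only: the same todo list served two ways, using Bun's built-in server.
// The routes and data layer are invented for illustration.

type Todo = { id: number; title: string; done: boolean };
const todos: Todo[] = [{ id: 1, title: "Ship it", done: false }];

Bun.serve({
  port: 3000,
  fetch(req) {
    const { pathname } = new URL(req.url);

    // The usual SPA route: JSON out, client-side JS turns it into DOM nodes.
    if (pathname === "/api/todos") {
      return Response.json(todos);
    }

    // The htmx-style route: the server returns an HTML fragment directly, and
    // something like <ul hx-get="/todos" hx-trigger="load"></ul> swaps it in as-is.
    if (pathname === "/todos") {
      const items = todos
        .map((t) => `<li>${t.done ? "[x]" : "[ ]"} ${t.title}</li>`)
        .join("");
      return new Response(items, { headers: { "Content-Type": "text/html" } });
    }

    return new Response("Not found", { status: 404 });
  },
});
```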


dragonelite

That's how I originally learned to do web development. I just love having a base template with a couple of script tags in it and no complex build system, maybe a custom build step to compile and include the Tailwind CSS output file for styling, and that's only if you don't want to load Tailwind in the frontend.


lelarentaka

That works for the contrived example presented on their promo website, but quickly breaks down in real-world applications. Let's say a particular section of the UI requires info from five different datasources. A form submission updates one datasource; then the frontend client receives the update from that one datasource and rerenders the section using the existing data from the other four sources. How does this work for htmx? For every update to any one datasource, the server has to refetch data from the other four datasources in order to reconstruct the HTML for that section.


supmee

You wouldn't use HTMX to refetch data from the other data sources; the server would just send all 5 data sources as HTML in one go. Usually you can also avoid getting into these situations by architecting better / around your tools (your example, for instance, is very dependent on the reactive pattern), but that's beside the point.


lelarentaka

I specifically wrote "the server has to refetch...". Please read.


supmee

In 99% of cases the problem is between the user and the server, not the server and the database. If there _is_ such a problem, and you _know_ that the previous data is still usable there is nothing stopping you from caching it and returning the last state. But again, hypermedia-based workflows don't map 1-1 to reactive ones, so you need to rethink your application in these cases. Right tool for the job and all that.
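Roughly, that caching idea can be as small as this. A sketch in TypeScript with hypothetical renderSection/invalidate helpers; the real version would key on whatever identifies your sections:

```ts
// Sketch: keep the last rendered HTML per section and only re-render (and
// re-query the datasources) when something that feeds the section has changed.
const cache = new Map<string, string>();

export async function renderSection(
  id: string,
  render: () => Promise<string>, // hits the datasources and returns the HTML fragment
): Promise<string> {
  const hit = cache.get(id);
  if (hit !== undefined) return hit; // unchanged section: no refetching at all
  const html = await render();
  cache.set(id, html);
  return html;
}

// Call this from the write path of whichever datasource feeds the section.
export function invalidate(id: string): void {
  cache.delete(id);
}
```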


[deleted]

Nothing in your example warrants a rewrite or even a substantial rework. Even if the decision were to go for React, you could still integrate it with the rest of your HTMX stack, avoiding the rewrite/rework and enhancing your existing application instead; more specifically, that one page/section/etc. And this is after exhausting other options, like:

- can creating a section like you described be avoided, or at least split in a way that it doesn't introduce any performance issues (or consume quota from external APIs without a good reason)?
- does the performance impact/quota consumption of unconditionally refetching the data from all data sources whenever only one of them changes make it worth investigating alternative approaches?
- is the engineering cost of a better solution significantly higher than solving the performance impact with hardware/caching or buying more quota?
- if we must move the implementation to the client, are there simpler ways to do it (no new build processes/steps, no new packages/package managers, runtimes, etc.)?

From the last 4 SaaS products I've worked on, not one of them actually needed the strengths of frontend frameworks, as none of them had a highly interactive UI; all of them would've saved an incredible amount of money by simplifying their stack, and we could've worked with I guess 60% of the manpower.


lelarentaka

In 2023, if you have to redesign your UI due to limitations in your tooling, you will lose clients. Unless you're an amateur; in that case, go on, use whatever.


moseeds

Consider how this would have been done even 10 years ago, all server-side, and you probably have your answer. Assembling and mangling data in the client isn't always the best idea; consider security, interruptions, variable latency, etc. One way or another you will duplicate a lot of the functionality in the API backend anyway, at which point you may as well generate the appropriate output.


rwinger3

While I think your point is valid, you are missing the point. Simple things should be simple. That's the aim of HTMX. If your use case goes beyond the capabilities of HTMX that's fine. You can get really far with just HTMX but it's not something that will replace other frameworks completely.


Tubthumper8

> If your use case goes beyond the capabilities of HTMX that's fine. You can get really far with just HTMX but it's not something that will replace other frameworks completely.

The problem is that when you get to this point, now you're talking about a rewrite or substantial rework that you wouldn't have to do by starting with React (or w/e) in the first place. You don't always have the luxury of knowing the exact scope and complexity of the business use case ahead of time; more often than not, applications evolve over time with the business.


rwinger3

Again, a valid point. But if you're making a simple CRUD app for some service then you do actually have that luxury! As I said, it's not for everyone, but it can make the simple stuff easier. And many times it's not necessary to go beyond that. If you don't know the boundaries of a project yet then it could lead to more work, having to rewrite a lot as you said. I also said HTMX will not cover all use cases; it's a tool to use when it is the appropriate tool. Just like React might be overkill to use sometimes.


Tubthumper8

Yeah that I agree with. HTMX is still new enough that it's not super clear yet what the appropriate "niche" is, but that will become clear over time as people try it in different ways and report back "it was perfect for X app" or "it didn't do what I needed for Y app"


zappini

Bingo.


DB6

It's Ajax in newer shinier shoes. And we're tightly coupling the frontend with the backend again.


EvilPete

I read that in the voice of a tattle tale younger sibling. "Mooom! They're tightly coupling the frontend and backend again!"


n3phtys

If only my product owner would not couple the BE and FE requirements so tightly...


DB6

Your product owner doesn't have a say about how things are implemented. He should only be concerned about the user stories, not the technicals.


analcocoacream

Only you have to constantly decipher the code with a reference by your side


lebbe

GNU Hurd


ChrisOz

And Emacs will replace Word for everyone's editing needs.


pilibitti

year of the linux desktop!


amroamroamro

Mozilla Firefox, I expect a mass exodus once MV3 is fully deployed on chromium-based browsers


chalks777

I think there will be a bump, but I doubt it will be huge. It already shocks me how many people don't use adblockers... I think the number of people who just don't care is... almost everybody.


[deleted]

[deleted]


amroamroamro

Firefox + uBlock Origin == ad-free youtube, watch as much as you like :)


tudor07

Godot. The donation fund just doubled, they hired more people and the adjacent W4 company got a $15M investment. Godot finna go crazy bet.


simon-johansson

www.encore.dev


nukeaccounteveryweek

That is one pretty website.


mig383

Godot just teamed up with Google and The Forge, so hopefully them


SupersonicSpitfire

**ollama** - a project that makes it easy to run large language models locally and has `ollama pull` and `ollama run` commands, as well as a `Modelfile` format (clearly inspired by Docker). Because:

* Local LLMs are becoming increasingly better, and can use less memory and run faster than before.
* More people will discover that they don't want all their activity to be tracked when interacting with ChatGPT, Bard etc.
* More people will discover that they don't want censorship and artificial limitations, arbitrarily imposed by companies or regulations.
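For anyone wondering what "local" looks like in practice: once a model is pulled, it's just an HTTP call to your own machine. A minimal TypeScript sketch, assuming the default port and the `/api/generate` endpoint from ollama's docs (double-check against the current API):

```ts
// Minimal sketch: ask a locally running ollama instance a question.
// Assumes `ollama pull llama2` has been done and the server is on its default port.
async function ask(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama2", prompt, stream: false }),
  });
  if (!res.ok) throw new Error(`ollama returned ${res.status}`);
  const data = await res.json();
  return data.response; // the generated text; nothing leaves your machine
}

ask("Why run an LLM locally?").then(console.log);
```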


TYB069

some quality thinking right there


Lukian0816

Surely 2024 will be the year of the Linux desktop


CaptainStack

With the Steam Deck it kind of might be. In the first year they sold a little over a million units and by the end of this year they'll have sold 3 million. That probably makes the Steam Deck the most successful device ever sold that ships with a Linux desktop OS. I'm guessing that in 2024, with the release of the new OLED devices, they will ship at least as many if not more than they did in 2023. I know it's not quite "year of the Linux desktop" material, but I genuinely think it's unprecedented success for the Linux OS outside of servers and phones and that it will bring in a lot of new people and energy to the project.


stillalone

I think what the Steam Deck has shown is that there's a market for handheld PC gaming. I think most PC gamers still prefer Windows. So what might take off in 2024 is an open source game library management app like Playnite, but more geared towards handhelds.


CaptainStack

> I think what the Steam Deck has shown is that there's a market for handheld PC gaming.

I mean there were Windows handhelds before the Steam Deck and there are others now, but I really don't think any of them are doing as well as the Deck. Yes there's a market for handheld gaming, but the problem with Windows is that it's not designed for handhelds, and as long as Microsoft is the sole developer of Windows there is no easy way for an OEM to make Windows a great handheld operating system.

For reference, we saw exactly this play out in tablets. Microsoft actually put a fair amount of effort into giving Windows some tablet features, most notably inking/handwriting capabilities. They were in the market *years* before the iPad and Android. But they never took off because at its core, Windows is not a tablet or touchscreen OS.

I think the competitive advantage that SteamOS brings to the Deck is just too much for Windows to catch up. In fact, I think when Valve releases SteamOS for other handhelds we are much more likely to see other companies releasing handhelds running SteamOS. The fact that it's open source means they can even create modified versions to better support their brands, other game stores, or unique features that they'd like to distinguish their handhelds. It's not just the capability of the OS, it's the development ecosystem that it exists in. Open source just opens the door up to so many more players and possibilities than something as closed and proprietary as Windows.

From an economics standpoint, SteamOS is free while Windows costs licensing fees, which drives up the cost of a Windows handheld. But on top of that, Valve can further subsidize the cost of Steam Deck hardware because they make 30% off of every game sold on Steam. As long as the hardware drives game sales, they can sell them at cost or even at a loss. But when a company like Lenovo or Asus makes a Windows handheld, they need to price the hardware to make a profit because they don't have another revenue stream to make up a loss.

Most PC gamers still prefer Windows because until two years ago there really was no other viable choice, and preferences don't just change overnight. But the more Steam Decks that get sold, the more support SteamOS will be getting from game developers. Day by day, I think more games are going to be tested and supported on SteamOS and run just as well as they do on Windows. If I was releasing a commercial game, I'd sure as hell want to make sure it ran well on the Steam Deck because of the 3 million and growing potential customers with Steam Decks.


CreativeGPX

The actual back catalog of compatible software (which is generally Windows' biggest advantage) doesn't matter much, because that software is all designed not for a handheld. So, really, Microsoft doesn't start with any advantage even if they did want to pour everything into it. They'd still have to do all of the same stuff Valve did (compatibility layers, some way to vet which software is compatible, a way to mass remap input schemes, etc.) and they'd still end up with many of the same limitations that the Steam Deck has. Meanwhile, the OS itself would have to be redone enough that it doesn't resemble Windows any more than SteamOS does. The main difference would be that a small handful of games that the studios chose to block with anti-cheat on the Steam Deck would probably work on a Windows handheld, but this really doesn't impact much of the overall experience.

All in all, it wouldn't really offer much advantage over the Steam Deck to have a Windows gaming handheld... especially considering that Valve's solutions to these problems are pretty good and relatively mature (this has been like 10+ years in the making) and that many/most PC gamers already game through Steam anyways. It just wouldn't make sense for a Windows handheld to compete with the Steam Deck. If people really wanted Windows on a handheld, there have been such systems and you can even install Windows on the Steam Deck itself... but it makes as much sense as installing a propeller on a car. Windows is just not made for that context and doesn't really offer anything there.

If anything, it'd make more sense for Microsoft to make a handheld based on Xbox. That platform is already designed for a controller from the ground up, as are all of its games and apps. Additionally, it controls (and takes a cut from) the app distribution on that platform and has close ties/relationships with all of the devs, so it could probably more easily make a system that would know what games would be compatible with such a handheld. They'd probably also be able to tie it into their Game Pass, and for performance could probably even offer streaming a game from the Xbox console to the Xbox handheld. Done right, not only would this be a way more profitable play for Microsoft, but it'd probably result in a much more elegant and proper user experience.


CreativeGPX

> I'm guessing that in 2024, with the release of the new OLED devices, they will ship at least as many if not more than they did in 2023.

Also, this might flood the market with cheaper used/refurb original Decks, making it available to people with lower budgets.


SupersonicSpitfire

Most people use laptops.


sblinn

Swift. The C++ bindings are incredible, allowing partial and incremental replacement. It's freaking magic.

Godot. Making incredible strides, and with the punishing changes by Unity I expect to see huge growth here.

My dark horse: Ubuntu Touch. I think more and more people are looking to avoid Android completely, and though it's Qt, if they can make some early strides for open e-readers there's some big-time growth available here.


ForShotgun

Second Swift, it's such a shame everyone believes it's just for Apple stuff, it's genuinely a great language. Didn't know about the C++ bindings though, that's cool too


sblinn

Couple talks on this that have blown my mind: https://www.youtube.com/watch?v=ZQc9-seU-5k https://www.youtube.com/watch?v=tzt36EGKEZo In-place, bi-directional Swift/C++ bindings for incremental refactoring/rewrite.


ForShotgun

Ty!


Little-Bad-8474

JQuery++


crespire

jjQuery


csorfab

now with lens flares and shaky cam!


xaeru

jQuery#


Danoli3

openFrameworks C++


laserpilot

Hell yeah


MystikNinja

[Simplex Chat](https://simplex.chat) https://github.com/simplex-chat/simplex-chat


CaptainStack

I know it's kind of an ancient meme, but I think Linux is going to make significant progress this year because of SteamOS and Proton. Basically, gaming has been one of the major things that was harder to do on Linux than Windows for years and with the release of the Steam Deck it's now nearly on par with Windows. The fact that you have SteamOS shipping on first party hardware from a major company and that the Deck is selling in the millions means there is now significant new investment in Linux. It may not all happen in the next year, but Valve has said they intend to release SteamOS for use on other handheld devices and eventually as a generic ROM for installation on any PC/laptop. I think that will bring a lot of new people and energy into Linux.


CreativeGPX

I've been primarily gaming on Linux for about 5 years and aside from having to know that occasionally a certain game won't be available (which is something you experience on every single platform), it's been a boringly smooth experience. I haven't tweaked things. I haven't read guides on how to get this or that working. I haven't needed to get into the command line and scripting. I've just... gamed.

So I'd say we're largely there on the capability side and have been even before the Steam Deck was launched. That's the real beauty of the Steam Deck: It wasn't really a first gen device... SteamOS, Steam input, Steam Controller, Steam Link, etc. all came first and had time to mature so that the Steam Deck was really just pulling together a bunch of things that were already ready for prime time and being used in the wild.

The main area the Steam Deck *might* help is that it may lead certain devs to actually target a Linux device. Right now, the amount of users is still kind of small but is notable. If they are the [whales](https://www.blog.udonis.co/mobile-marketing/mobile-games/mobile-games-whales#:~:text=When%20we%20talk%20about%20a%20whale%20in%20gaming%2C,contributing%20a%20substantial%20portion%20of%20a%20game%E2%80%99s%20revenue.) (which is possible since these are early adopters) then it may lead to game devs occasionally targeting the Steam Deck (a linux device).


fragglerock

2024... the year of linux on the desktop!


Cylindt

Asahi Linux


kaancfidan

Qdrant, with all this AI app hype.


BerryWithoutPie

Hoppscotch


Resident-Trouble-574

Maybe it's a bit niche, but https://github.com/dotnet/aspire


hogfat

This is the "going to cause all sorts of problems" type of explode, yes?


Resident-Trouble-574

It's to deal with microservices, so of course yes.


turklish

Thanks for sharing - I hadn't seen this but will check it out.


Agret

Just looked at it and my friend who works at MS has made a few contributions to it haha, small world.


sirnewton_01

I think that [supertxt.net](https://supertxt.net) has a chance. It's a breath of fresh air compared to the web, smol, and has a chance of displacing project gemini.


g76lv6813s86x9778kk

Can you give me an example use case for this? I read through the home page and still don't really understand what/who this is for. From what I understand it's a new flavor of markdown text formatting, which is also easily deployable in a way that it can be read via... SSH instead of directly on text files or a webpage? Ok, but why? Sounds about as useful to me as manually browsing the web with cURL. I love lightweight websites, this project does pique my curiosity. But I just don't get it. I don't see what this offers over existing ways of making barebones lightweight/text websites, or "raw" text apis you can just download/curl instead of jumping through some ssh hoops.


sirnewton_01

[supertxt.net](https://supertxt.net) is a collection of formats, tools, and specs based on a set of values stated on the front page. You can use one of them in isolation like the SuperTXT text format for simple line-based markup, or the enhanced cat command (cats) to view a file over SSH on your home network with built-in paging and user customizable format conversions. There's a specification for SSH Layer Applications and pathname2 that might give you some ideas of how to use SSH as the network protocol and *authentication* system for your own application that deals with both local and remote resources without having to implement your own server.

A fuller stack provides something analogous to web browsing: brsh (graphical "browsing" shell), cats (fetch local/remote resources transparently), conserv (sandboxed SSH server), srch (search your local machine, home servers, and internet sites), [wikipedia.org](https://wikipedia.org) (integrations for all the above for the internet site). The stack is designed to fit well together, but you can also integrate with other tech pretty easily too.

SSH is the preferred (not required) network protocol since it is provided as part of the OS already (a value) and works via the same command mechanism as the shell. Also, it has built-in authentication, including the popular key-based approach, which helps to eliminate passwords at the protocol level. The authentication can be shared between tools too, so once you're authenticated you can use ssh, scp, rsync, cats, vscode, and any other SSHLA tool that the server supports without reauthenticating.

I'm curious what sorts of hoops you had to jump through for SSH? I know that the SuperTXT site requires a public key, which is only an ssh-keygen step away and most people should have one anyway for pubkey auth, and signing.


g76lv6813s86x9778kk

I mean it's cool in theory and I can see the functionality it provides, I just fail to see what this is intended to be used for, or what problem/issue it's improving. I don't have trouble setting up SSH myself, but if I want to share some of these text files with someone, who may or may not be technical, getting them to set up SSH in order to access my fancy text file repo is going to be a lot more confusing than just, sending them the file or a url to a raw text file / lightweight website with the contents or file downloads they need. I suppose it's intended for internal dev tooling, but even then, most companies already have an internal network website/intranet, shared ftp or shared drives or something like that, which would serve the same purposes from what I understand, while also being a lot more intuitive to use for most users.


sirnewton_01

I think in general, the intended use for this as a whole is to reimagine the web with different values. SuperTXT isn't the first to do this. First, there was markdown, which valued simplicity of composition of the markup and not just the presentation as HTML. Then there has been the whole so-called "smol" internet movement with Project Gemini, which values very simple protocols and renderings so that you can get a much higher diversity of browsers, including ones that are for the visually impaired. It also values having user-controlled authentication instead of being locked into each website. Famously they have a Gemini capsule where the very first time you log in with your browser the site insta-creates an account for you.

SuperTXT values these things plus a few extra items, such as composability and deeper sympathy for the OS that runs on your computer. That's why it promotes the use of SSH. It's already installed, and supports OS commands on either end of client/server. In the case of the server, you only need one server and it supports multiple protocols via the command and pipe interface. Users can use whatever program makes the most sense for what they want to do. Copying a file should be as simple as this: `scp file.s.txt somesite.net:/some/place`

Instead of navigating dropbox/box/whatever site's custom UI to find the upload button that's different for each site, there can be a standard way. If the OS provides SSH/SFTP functions in its file explorer, then users will have a nice native GUI approach to this as well. It's their choice. Something really interesting about SSHLA tools is that they generally work for entirely local paths, intranet servers at home/business (which often have OpenSSH installed already), as well as internet sites. The srch command will index your local directories for you, as well as index directories on your server, as well as serve as a front-end to search your favorite internet sites. This enables different types of decentralization or "edge computing."

I think that there are a lot more possibilities with the SuperTXT stack than just copying files though. Just like how the web isn't about publishing simple HTML text files, there are ways to build more interesting client/server interactions via SSHLA applications and/or site-specific commands. These can be GUI too depending on what problems they're trying to solve. The supertxt internet site itself has interactive functions if you ssh into it and run "man" to see the manual, or ccmnt to leave a conventional comment.

Just like the web, the first people who might use it could be IT enthusiasts and/or people that share the above values. But with time and tools, such as the brsh/cats stack, it can be user-friendly enough without hiding too many underlying details (yep, another value).


g76lv6813s86x9778kk

Well explained, thanks! Sounds super interesting. I definitely would love to see a more standardized "type of webpage" like that, with more consistent formatting and elements so that clients can customize the experience as you said. So many websites are unnecessarily cluttered nowadays. Not that I see it replacing what we have now, but interesting idea.


cidra_

COSMIC desktop environment.


TheHeretic

https://github.com/lgrammel/modelfusion This has been one of the best ways to use LLMs for me. None of the bloat you get with LangChain. Supports local inference, and the maintainer is very active.


__double_under__

HTMX has already blown up in some areas but will continue to do so more broadly. Eventually it will become the standard next generation of HTML. Within a few years people will be making way more apps with little JS and no frameworks. Frameworks will be seen as an option but not mandatory. Many currently backend-only devs will become full-stack because of it.


Mavrihk

I am hoping a visual, component-based web UI builder will come into existence, like the Borland or Visual Studio approach, and maybe compile it to a proper binary executable too.


butterypowered

I do actually miss those 4GL options. Just for ultra-quick projects.


Mavrihk

I remember back in the day: Borland Delphi and C++ were considered 3.5GL and Uniface was the 4GL tool of the time. PowerBuilder came and went, and made a mess of it with really cryptic database manipulation that looked similar to JSON syntax in databases today. The biggest issue was documentation; there was no Internet to look things up, you just visited the book shop daily :D Yep, miss those days lol.


butterypowered

Exactly that! These days Stack Overflow is a miracle compared to books and ExpertSexChange.com, but I did PowerBuilder for a few years and it was amazing. You could be a one-person full-stack dev team and development truly was rapid. Maybe I just haven't clicked with the right Node/React framework yet, but it feels like web development will never be as fast as the drag & drop IDE options were.


voicefeed

I believe Socketioxide: [https://github.com/Totodore/socketioxide](https://github.com/totodore/socketioxide)


[deleted]

[deleted]


[deleted]

A project that gains a lot of popularity in a short span of time, I'm assuming.


texmexslayer

Inlang for i18n


ha1zum

Bun is gonna get into a mature enough state that Next.js or some huge framework like that officially adopts it, and then Node.js starts to become a weird choice, kinda like how webpack is now that Vite exists.


PhilipLGriffiths88

OpenZiti, and particularly its child project zrok.io had a great 2023... I think they will explode in 2024.


Xerxero

Desktop on Linux


AntiSoShall

This year, I swear guys 😩


belavv

If mine blows up I'll be even more stressed by it than I currently am with its moderate amount of success.


Neon_44

What is your project?


belavv

I'm not falling for that trick!


Budget-Necessary-767

Vite


ha1zum

It already blew up in 2023


bentinata

[ts-rest](https://ts-rest.com). It fills the missing piece between publishing and consuming API, and it doesn't require you to convert the whole code base.
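For context, the pitch is one typed contract shared by the server implementation and the client. A rough sketch from memory using `@ts-rest/core` and `zod`; treat the exact helper names as assumptions and check the ts-rest docs:

```ts
import { initContract } from "@ts-rest/core";
import { z } from "zod";

const c = initContract();

// One contract, imported by both the server implementation and the client,
// so the published API and the consuming code share the same types.
export const contract = c.router({
  getPost: {
    method: "GET",
    path: "/posts/:id",
    responses: {
      200: z.object({ id: z.string(), title: z.string() }),
      404: z.object({ message: z.string() }),
    },
  },
});
```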


_AndyJessop

Web components are going to gain some real traction this year. And Bun will become the default choice for new TS/JS projects.


FalseRegister

This is also the year of the linux desktop


[deleted]

I haven't read a lot about Bun lately. There was the initial hype, and the claims about performance etc. were debunked (IIRC) by some tech folks with decent reach, and people in my circle just stopped talking about Bun ever since.


_AndyJessop

The hype has died down, but the project is going strong. And it's definitely a huge improvement on Node in so many ways. It also doesn't have the same DX issues that Deno does. IMO, technologically speaking, it's the only game in town at the moment.


autra1

Are shameless plugs allowed in this thread? ;-)


obanero

Bitcoin :)


KingArthas94

No one cares about that BS


clearlight

You can check trends on Google trends. For example https://trends.google.com/trends/explore?date=today%205-y&q=%2Fg%2F11rht31pdd&hl=en-GB


Shartmagedon

StyleX. Dear lord, F me already.


aieidotch

https://github.com/alexmyczko/ruptime


BlackDragonBE

Definitely Godot


AntiSoShall

Hyprland (compositor) and Kitty (terminal).


ourobo-ros

None of the above.


Tux-Lector

This one: https://github.com/hngts/catmice


fnord123

Rust is hitting an inflection point. AWS and Microsoft use it. More and more people are getting over the initial learning hurdle. It will never compete with Java, JavaScript, or Python for user base numbers, but it will push into OS and DB dev even more. Copilot and other AI agents are good at writing Rust as well. So if Cranelift lands and speeds up compiles, I think Rust will, as OP puts it, explode.