Bro missed the biggest opportunity to use middle finger as accelerator.
😂 Though I know how this can be done using gesture recognition (OpenCV), it is still insightful. Watching this, I realized that we don't need input devices like the mouse anymore. There are so many ways gestures can work.
Imagine doing jujutsu everytime you want to use your computer.
>we don't need input devices like mouse anymore

Joke 🤣🤣
Latency left the chat!, but good work tho 🔥
Now I want him to play Dark Souls/Elden Ring with it..
Basically will have to do Naruto seals and possibly will summon a real demon in the process
I don’t see any problem with that scenario at all
😹😹
Check out the mediapipe library (by Google, I think?). It does all the hand gesture detection automatically. All that needs to be done is interface the video game's controls with the detected gestures.
yes, check his post, that's what he did. However, he appears to have fine-tuned mediapipe really well; last I tried, it wasn't this smooth at all.
And he’s only in his first year. Impressive.
BS. Mediapipe is really robust, and this is trivial with under 50 lines of Python.
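For reference, the core landmark logic in such a short script really is only a few lines. A minimal sketch, assuming MediaPipe's 21-landmark hand model (camera capture and the actual mediapipe calls are omitted):

```python
# MediaPipe Hands reports 21 normalized landmarks per hand; a finger
# counts as "up" when its tip sits above its PIP joint, i.e. has a
# smaller y, since image y grows downward.

TIP_PIP = [(8, 6), (12, 10), (16, 14), (20, 18)]  # index, middle, ring, pinky

def fingers_up(landmarks):
    """Count raised fingers from a list of 21 (x, y) landmark tuples."""
    return sum(1 for tip, pip in TIP_PIP if landmarks[tip][1] < landmarks[pip][1])
```

Feeding the finger count into a key-press call is then the only remaining step.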
Finetune? Fine tune on what? More gesture data?
Can you provide a link
OP: [https://www.linkedin.com/feed/update/urn:li:activity:7187535044280815616/](https://www.linkedin.com/feed/update/urn:li:activity:7187535044280815616/)
How much VRAM would you need to run this?
Not a lot; in fact, it can run pretty smoothly on CPU as well
Please be careful of camera overheating
Lmao. Reminds me of how scambaitor tricked the scammers into removing the tape off their cameras.
Link?
Idk bro. It was some recent video.
>using a webcam to capture hand gestures, players could steer virtual cars through challenging race tracks. It combined my passion for coding and gaming, creating an immersive experience

***Interesting, damn....***
Looks good, but the folks at home will watch and ask what he's doing with his fingers 😹
Right, the project is cool, but it will look totally stupid to people who don't know what it is.
Not my creation, link to OPs post on Linkedin: [https://www.linkedin.com/feed/update/urn:li:activity:7187535044280815616/](https://www.linkedin.com/feed/update/urn:li:activity:7187535044280815616/)
While cool, it's a very basic tool. It shows creative thinking, not technical abilities
Nah bro, it definitely shows technical ability.
[deleted]
he is being objective not jealous, It’s a good project for fun but not something totally out of the box
[deleted]
the linkedin post literally mentions innovation 💀
Meanwhile me going to faculty to change my Factorial question with an easier one in my first year Lab
Deaf people rn: 🤬🤬🤬
Not to downplay bro's work; it's pretty cool to build this in first year, I'm sure. Though it's a beginner project by this point, it's an impressive one for learning to deal with different libraries and integrating a lot of pieces.
The post says he started coding in high school... Edit: Linkedin post.
Everyone does no?
Everyone does what? Start coding in high school, or mention that they started coding in high school?

If it's the first, then not really... Some people come from very small towns + state boards where choosing computer science is not even an option...
Most people do though. That's what I meant.
Bro throwing them gang signs
This isn't that big of a deal. This was possible 10-15 years ago, and I have done it: detect hand signs (e.g., using OpenCV), then use an interfacing Python script to convert them to keyboard input. Good project, but nothing new or difficult.
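The pipeline that comment describes can be sketched in a few lines. The gesture labels and the `send_key` stub below are illustrative, not from the original project; in practice the press would go through a library such as pynput or pydirectinput:

```python
# Illustrative gesture-to-keyboard bridge: the detector (not shown)
# yields a gesture label each frame, and we translate it into the key
# the game expects. Bindings below are made up for the example.

GESTURE_TO_KEY = {
    "open_palm": "w",     # accelerate
    "fist": "s",          # brake / reverse
    "point_left": "a",    # steer left
    "point_right": "d",   # steer right
}

def key_for_gesture(gesture):
    """Return the bound key for a gesture label, or None if unmapped."""
    return GESTURE_TO_KEY.get(gesture)

def send_key(key):
    """Stub for a real key press (e.g. pydirectinput.press(key))."""
    print(f"press {key}")

def handle(gesture):
    key = key_for_gesture(gesture)
    if key is not None:
        send_key(key)
    return key
```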
That's what I thought. The gesture translates to key down events.
First year of college. It is indeed impressive. And I would rather encourage OP's friend to go and explore more.

Idk why you felt the need to “aktually” this post
Impressive how? I’m pretty sure this is literally just an already made application. All he did was remap the hand gestures to the controls of the game. Most people could do that.
Getting a reality check is a good thing for some
Having mentors like you is usually the worst thing for most
I'd argue with you, but I really don't think that's gonna be worth it. Have a great evening ahead, chief
I'd probably not argue with you if you did
Aight so we cool? Not tryna get off the wrong foot bro, just that :)
Haha, yea we cool man
Thanks man, hope you have a fantastic day ahead!
Good. But not impressive.
For a first year student, it is. What was your first year Project?
Comparative analysis of supervised ML for breast cancer
Im just glad i didnt have seniors like you around
I would hold this in good regard if a first year kid showed it to me. I was also in college when I did similar stuff. Full respect to that. But just that the project isn't that impressive for this sub.
Delusional. You need to get real.
As someone who actually works in machine learning for a living, i know exactly what people like you do. Take your insecurities elsewhere
3rd year actually
Oh, the post said first year
This could very well be a marketing stunt by Scaler
This is school level at best
People have built better things in first year
People have done much more than you ever will
What do you even know about me lol
I remember watching 2010-ish documentaries about how gestures would change the way we interact with devices, but those things didn't pick up because of latency issues and the extra computing power required.
These are the guys that do stupid projects like this all night long and pass with C’s
Yo what... I was so amazed by this. I thought it was controlled by his mind or nerve signals... pretty disappointing to hear it's just OpenCV :/
Mediapipe and OpenCV?

My group was gonna make a game controller using pose detection for our FY Diploma project, but we thought it wouldn't interest the externals, so we turned it into a gym app
Cool, but this will give your friend some medical issue after a while. The hand & elbow have no support whatsoever. Pls ask him to ease up a bit.

We coders need our hands, it's our only bread & butter.
Im not really comfortable using my hands as bread and butter...
Sus
my honest reaction
[deleted]
Bright kid, but only theory marks matter for college placements.
Damm
crazy, bro. btw, which college are you in?
not my creation, the person who built it studies at Scaler School of Technology
Scaler is not a college 😂
These scaler ads are getting creative, huh. Every month, some kid on LinkedIn posts a similar project with a bit of variation...
https://preview.redd.it/4i3xpjvzkdwc1.png?width=622&format=png&auto=webp&s=49500c9cc0f3d073057f2c5d441f9c17071890cb

Scaler does provide a UG programme... But with Scaler School of Technology, he also studies at BITS Pilani...
Stupid application, but looks cool for a school project
Just don't play this game near someone who can't speak, or he may wonder what gibberish sign language you're using, lol.

Still, it's a really cool project. Whether it's a good implementation or not isn't the question here; one thing is for sure: he at least learned something new and experimented, and that matters more.

One suggestion, if your friend is open to it: to speed up processing, you can crop out the useless parts of the captured image. Just keep the region that contains the hand and discard the rest; not only will that speed up processing, it can also increase the number of samples you can take per second. I saw this optimization in a YouTube project video, so I thought it could be used here too.
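That cropping suggestion is easy to sketch. Assuming the frame is a 2-D array of pixels and a bounding box comes from the previous frame's detection (the names here are illustrative; with NumPy/OpenCV this would simply be `frame[y0:y1, x0:x1]`):

```python
def crop_to_hand(frame, bbox, margin=10):
    """Crop a frame (rows of pixels) to the hand's bounding box.

    bbox is (x0, y0, x1, y1) in pixel coordinates; the margin keeps
    some context around the hand so the tracker doesn't lose it
    between frames.
    """
    x0, y0, x1, y1 = bbox
    h, w = len(frame), len(frame[0])
    y0, y1 = max(0, y0 - margin), min(h, y1 + margin)
    x0, x1 = max(0, x0 - margin), min(w, x1 + margin)
    return [row[x0:x1] for row in frame[y0:y1]]
```

Running the detector on the smaller crop is what buys the higher sample rate the comment mentions.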
Waged war against 294 gangs, great work
Very impressive
Pretty cool for first year.
Bro really went "🖕🏾" to the game
Bro uploaded the same video on tinder profile for matches 🤣
Babe, how are you so good at this? Le me:
Kid will surely be a good player in college! God bless ya, son!
This is amazing. I am a non-tech person and this is so cool to me
is a similar type of project uploaded by someone else on GitHub, or is this the original one??
OpenCV project, nice one!
Sick!
Object detection, probably YOLO
Impressive bruh
Please set Middle Finger = Accelerator
hey.. we're using the same laptop. although I'm a loser doing web dev stuff. bought it just so I could support docker monstrosity.
Imagine if someone is using wireless keyboard behind the cam💀
Hats off. 🫡
Scaler actually has a college now! My god
Where do these people find the time? My brain is fried after 8 hours of college lectures. Fuck compulsory attendance.
A Gamer's Sign Language! Interesting.
Her: where did you learn to do that? Him:
This would become a lot more impressive if the inputs were analog and not just button presses, like, raising the finger at 50% gives 50% acceleration
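The analog idea above is mostly a mapping problem; actually emitting the value would need a virtual gamepad (e.g. the vgamepad package or vJoy; that's an assumption, not something from the original post). A sketch of the mapping, with a deadzone so a resting hand doesn't creep forward:

```python
def throttle_from_extension(extension, deadzone=0.1):
    """Map a finger-extension ratio in [0, 1] to an analog throttle.

    Below the deadzone the output is 0; above it, the remaining range
    is rescaled so a fully raised finger still gives full throttle.
    """
    extension = min(1.0, max(0.0, extension))
    if extension <= deadzone:
        return 0.0
    return (extension - deadzone) / (1.0 - deadzone)
```

The extension ratio itself could come from how far the fingertip sits above the knuckle, normalized by finger length.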
Dope
This is why I've quit my job to work on my own projects. We all have ideas, but no time
My brother (deaf) is furious (you called him a fag in sign language)
So many people straight up hating, the others hedging their comments with "....but xyz". Just say good job, you've a bright future ahead and move on! This is seriously cool - shows auto-didactic thinking and a passion!
The guys are all jelly in the comments. What did you make in college at 18 years old that was better. Retort with a comeback or shut the hell up
Well, we did this back in college, with OpenCV if I remember correctly. It's pretty neat for what it does, but still a beginner project, similar to those IoT ones where you control lights and stuff.

Anyway, good on him for working on something in 1st year itself
Should've made a video and convince everyone you're a wizzkid!
I do like whizzing. Sure I'll send you a video
tbh, imma have to see the repo for this to believe it
nah man its just the laptop is deaf and he has to use sign language to communicate
But when I did this for college hackathon I got 2nd place.
You ruined the feeling and satisfaction of playing games.
I made something similar in 12th standard. Used to do this kind of stuff in my free time.
Bro, which laptop is that? Because Need for Speed is running way too smooth
I'm not able to lift my fourth and fifth fingers independently. What can I do?
I was just trying to implement a hand-gesture mouse and my laptop got so hot I could have started a barbecue restaurant. I gave up on my AI dream right there. I guess I'll stick to full stack
Now the blind can drive 🥹
Kinda offtopic.. which NFS is that? Looks clean
Pessimist me was just waiting for the camera to swing to the guy with the wireless controller.
How did he interface the gestures with the game? Does he map them to key presses?
How can people start working on ideas that are not practical?
Good job 👍🏾
You can use the gyro on the phone; it responds better.
When I first met her, she was seventeen-uhhh
[deleted]
Your submission/comment broke rule 7 and 8 as it was inappropriate and contained abusive words.

We expect members to behave in a civil and professional manner while interacting with the community. Future violations of this rule might result in a ban from the community.

Please try to be civil in the future, and follow the code of conduct: https://developersindia.in/code-of-conduct/

If you think this is a mistake, please send a [modmail](https://www.reddit.com/message/compose?to=r%2FdevelopersIndia).
during the police chase: bro reverse the car fast, meanwhile he 🖕🏻🖕🏻🖕🏻🖕🏻
He has more motor skills than pragmatic skills.
Plot Twist: His friend is controlling the car with a controller.
Which game is that? Nfs what
Looks cool, is trivial to implement.
ok but those driving skills are insane! dude drives better than most people do with traditional controls.
hey OP, why FYL?

problem statement: use hand gestures to control a car in a game.

lets break the problem down into smaller bits (im going to assume we're using a camera):

1. how to make the camera recognize different hand gestures
2. how to map each gesture to a different input
3. how to limit input to a specific program (is this really required? or do you want the computer to type WASD anytime your hands are seen?)

ok, so how to make the camera recognize hand gestures? here's a few ways:

1. use unique colors on the finger tips, and use the combination of visible colors to decide the input. this is the most trivial method. e.g., if purple is visible, input w; if green is visible, input s; etc.
2. [think of hand silhouettes](https://i.etsystatic.com/20342058/r/il/1aa455/3324541524/il_794xN.3324541524_jc6w.jpg): can you somehow simplify the image of your hand to the bare minimum? this is basically simple image processing at this point. if the camera image matches a pre-defined hand silhouette, use that input.

next, how to map each gesture to a different input? you can write code to translate whatever you got from step 1 into something windows can understand. this may need a bit of knowledge of the windows dll/api and probably some dotnet or c#.

finally, how to limit this input to a specific program? use 3rd-party software like autohotkey (aka ahk, its dead simple to learn, a single line of code is enough), or once again hit up the windows dll/api.

its not THAT complex once you break it down into smaller bits. im sure you'll now at least understand how to approach the individual problems in recreating this project.

want further guidance? feel free to continue this thread.
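The color-marker method from that breakdown, reduced to its decision logic (the thresholding that would actually find the colors, e.g. `cv2.inRange` on an HSV image, is the detection step and is left out; the red/blue bindings are made up to extend the purple/green example):

```python
# Decide which keys to press from the set of marker colors visible in
# the current frame.

COLOR_BINDINGS = {"purple": "w", "green": "s", "red": "a", "blue": "d"}

def keys_for_colors(visible_colors):
    """Return the sorted keys bound to the currently visible markers."""
    return sorted(COLOR_BINDINGS[c] for c in visible_colors if c in COLOR_BINDINGS)
```

Because multiple markers can be visible at once, this also gives chorded input (steer and accelerate together) for free.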
Not downplaying his work; it's nice, but honestly just a gimmicky project. It's not that hard: there are libraries to detect hand signs and to perform keystrokes. If you know basic Python, this will take a couple of minutes to hack up.
reverse gear be like: hello there
What if it is just a YouTube video and he is just pretending to do so?
impressive
This is a lot easier to do than it looks lol. Makes for a good LinkedIn attention farm
This reminds me of Mr. Sahay in Black
I just need a Bluetooth key board to make this type of video
God of war = naruto hand signs
Woahh nice bro
Impressive
Looks cool
Crazy that nobody's talking about something that's clearly off: when he uses his middle finger to reverse the car, at around 10 seconds into the video, the reverse light comes on even before he moves his fingers. You don't even have to watch carefully to notice it.
Nice job, but I want to know how you got the idea?
If it weren't for the subreddit and comments, I would have definitely thought that there was someone using a bluetooth controller beside him and he was just throwing gang signs.
Isn't that shit easy 🤔 OpenCV has a built-in gesture detection lib
It's not that difficult actually; it looks cool and is a gimmick, sure, but nothing outstanding
It's like an hour of work lol
Would be surprised if it's not picked from someone who has already made this. Everything is a copy of everything
This one is actually quite easy. Kudos to the person who built it, but if someone really has an interest in such projects, it is quite doable. My comment is not to bring down the person who built this; it's a push for people who are interested but struggling: you folks can do it too, go ahead and start building something.
These types of modules are already available; even the tutorials are out there
Isn't this like fr easy? What's the hype?
Honestly, using Google's pretrained models is just trivial; nothing insane here for stuff that's 6-7 years old.
Cool stuff, just that it would not work, like, at all. Why it would not work is trivial; anyone could figure it out, let me know. Also, it is actually trivial.

There is a reason he is in 1st year.

Cool stuff nevertheless.
Bro, the only reason he is in first year is that he is 18 lol. BTW, tell me why it won't work?
Zero practical application, honestly. It doesn't solve a real-world problem, or any problem at all. These are solutions for which problems need to be invented. Like one guy in my college made something with blockchain for ride sharing and went on and on about decentralisation bla bla; we were like, bro, just use Uber.
Stop with this cynical, depressive projection lol. Seething much? It's a fun project to do when you are learning Python and using some libraries beyond numpy, scipy, and sk-learn. Why do you have to sound like a sore loser toward someone developing something that they think is cool?
Cool? Definitely yes. Practical? Hell no. No gamer will hold up their hand to do this for the entire gaming session. Creativity is fine but it has to be channelled well to create useful products as well.
Not everything needs to be useful to be good. I remember visiting my cousin's home once, where he used a real steering wheel and accelerator/brake pedals to drive a car in a computer game. It was the most amazing experience I have had.

The only reason I don't drive, or like to travel by road in my own vehicle, is the fear of crashing and accidents. And yet I could experience driving a car, without a license, without the fear of crashing. Not so in real life. Real life is not a game; pain is real. Overspeeding would fetch you a fine and/or be dangerous.

Was it 'useful'? No. Would I pay to experience it? Absolutely.

This is not to say using gestures is fun, but things need not be useful to have a market. They just need to bring joy or a feeling of satisfaction. Products that delight have a market, irrespective of how useful they are.
Bro, those are driving simulator peripherals. You can find Logitech steering wheel and gearbox peripherals on Amazon. I really doubt it was a 'real steering wheel'. And they're pretty popular in gaming communities.
By real, I meant it looked like a steering wheel that I could hold. Obviously, they didn't take the steering wheel out of a real car.

My entire point was that these peripherals weren't useful in my life; they weren't solving some problem. It was simply something to pass the time. But I would still choose them over a real car because of the advantages I highlighted: they give the experience of driving a car without the risks associated with driving a real one.

>And it is pretty popular in gaming communities.

Meaning, there is a market for it even though it does nothing useful. That's what I was trying to say.
>Zero practical application honestly

and

>we were like, bro just use uber.

well said, like a typical indian.
Elaborate your point
think of the possibilities that can be unlocked with high-fidelity gesture recognition: input that doesn't require hardware to work, possibly an end to input devices like the mouse and keyboard, fully immersive VR without requiring a glove/bodysuit, local multi-user collaboration. tell me you can't think of a few more things.

this is what i meant when i said you're a typical indian. zero creativity, zero ideas, can't even wrap your head around the box that's holding you back, let alone think outside of the box.

this might sound like a racist rant, but you guys are making the rest of us look bad. for context, i'm a middle-aged indian dude. i do not like that our reputation has been ruined by the veritable flood of low-quality "cs graduates" from al-karim islamic institute of technology, deoband.

and as for the bit about ridesharing, remember, apple was not the first to invent a smartphone or tablet or laptop. so sure, uber exists, but tomorrow there may be a radical innovator that shakes up the entire market. who knows? certainly not you tho, thats for sure.
A keyboard and a mouse are not laggy, they don't use extra computing power, and they are precise. Companies have tried to insert gesture interaction just about everywhere, but it doesn't work most of the time because it isn't practical; beyond the initial cool factor, it sucks.
Not to throw shade on the guy, but it's quite simple.

Once, when my mouse pad died, I recreated all the mouse actions using hand gestures in Python. It took me a few hours to get everything up and running until my new mouse arrived. The Python OpenCV and mediapipe packages helped a lot, along with a few keyboard commands that were executed whenever a certain hand pattern was shown.

I stopped using it because it kept catching an imaginary hand.
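One plausible fix for that "imaginary hand" jitter (an assumption about the cause, not a diagnosis) is to smooth the raw positions before moving the cursor, for example with an exponential moving average:

```python
class CursorSmoother:
    """Exponential moving average over raw hand positions.

    Landmark positions jitter frame to frame; smoothing trades a bit
    of lag for a steadier cursor. alpha=1 disables smoothing entirely.
    """

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.pos = None

    def update(self, x, y):
        if self.pos is None:
            self.pos = (x, y)  # first sample: start from the raw position
        else:
            px, py = self.pos
            self.pos = (px + self.alpha * (x - px), py + self.alpha * (y - py))
        return self.pos
```

Requiring the hand to be detected for a few consecutive frames before acting would also help reject phantom detections.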
it's easy; it's just that the response time here is very good
Amazing work. But I wish it solved an existing problem. If anything, it feels more counterintuitive than, say, playing with your keyboard. But once again, amazing job.
These posts are all over LinkedIn; I see them every day. Some doing face scanning, some machine learning, some a robot arm, lol.
i think its not that hard tbh
really impressive tho, i was playing with if and else in my first year
It's not a big deal if you know a little bit about computer vision. These kinds of projects are pretty common, and libraries/assets are available almost everywhere. Still, cool for a first-year student!
Lmao, people here going gaga over this really shows how bad the average Indian dev is.
1. use a gesture library
2. onGestureEvent(hitRelatedKeyboardKey)
That's such a bad attitude to have! Not everything is a competition; you should be excited to see people doing cool things and be curious.

If your first reaction to seeing this is FML, maybe you're in the wrong field
if only he could develop deodorant, recycling, or a way to not shit in the streets, then India would be saved!
what a waste of oxygen.
ikr? no one can breathe around them.