Gesture Volume Control | OpenCV Python | Computer Vision
- Published 29 Mar 2021
- In this tutorial, we are going to learn how to use Gesture Control to change the volume of a computer. We first look into hand tracking and then we will use the hand landmarks to find gestures of our hand to change the volume. This project is module-based which means we will be using a previously created hand module which makes hand tracking very easy.
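The core of the technique is a single interpolation: the pixel distance between the thumb and index fingertips is mapped onto the audio device's volume range. A minimal sketch of that mapping with NumPy (the 50-300 px pinch range and the -65 to 0 dB range are assumed values; the real device range should come from pycaw's GetVolumeRange()):

```python
import numpy as np

def length_to_volume(length, min_len=50, max_len=300, min_db=-65.0, max_db=0.0):
    """Map the thumb-index fingertip distance (pixels) to a master volume (dB).

    The pixel and dB ranges are hand-tuned assumptions, not fixed constants;
    np.interp also clamps inputs outside the range to the endpoints.
    """
    return float(np.interp(length, [min_len, max_len], [min_db, max_db]))

print(length_to_volume(50))    # -65.0 (pinch closed: muted)
print(length_to_volume(175))   # -32.5 (halfway)
print(length_to_volume(300))   # 0.0 (pinch open: full volume)
```

In the full project this value would then be handed to pycaw's SetMasterVolumeLevel on Windows.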
🚀🚀 My Urdu/Hindi AI YouTube Channel 🚀🚀
/ @murtazahassan01
Download Code:
www.computervision.zone/cours...
Hand Tracking Bare Min Code and Module Video:
• Hand Tracking 30 FPS u...
Gesture Volume Control [Part 2]
• Advance Gesture Volume...
Premium Courses:
✔️ Computer Vision Game Development Course:
bit.ly/3ttLZ2s
✔️ Computer Vision with Arduino Course:
bit.ly/3wzLB4m
✔️ Advanced Drone Programming Course:
bit.ly/3qs3v5g
✔️ Learn to Build Computer Vision Mobile Apps:
bit.ly/3uioY1J
✔️ Jetson Nano Premium Course:
bit.ly/3L8uIlF
⚙️⚙️⚙️--My Gear - ⚙️⚙️⚙️
👉 Complete Gear 💈: www.computervision.zone/tech-...
👉 My PC Specs 🖥️: www.computervision.zone/tech-...
👉 My Video Shooting Gear📽️ : www.computervision.zone/tech-...
👉 My Laptops 💻: www.computervision.zone/tech-...
👉 Educational Products🧑🎓: www.computervision.zone/tech-...
👉 YouTube Starter Kit 🔴: www.computervision.zone/tech-...
Follow Me:
TikTok: bit.ly/3Vo76OQ
Facebook Group: bit.ly/3irDcb7
Discord: bit.ly/3JvyxAM
Facebook Page: bit.ly/3IvpU7W
Instagram : bit.ly/3NdGME3
Website: bit.ly/3ICFTS0
Github: bit.ly/3woU6PS
#ComputerVision
#OpenCV
#CVZone
You need a medal, because no one on youtube uploads such videos.🙏
that is depressingly not true
@Mynk FF Thank you for your kind words
@@Skisful link for others plz ???
Everyone is busy in predicting house priceeeee
@@Enjoyurlife1789 this is hilarious! 😅😅
What a time to be alive. Only legends will understand 😁
Which IDE is he using? Looks like VS Code though
@@GenAIWithNandakishor thank you
@@samuelscheit correct
Dear fellow schoolers
Two minute paper those who are confused
People in the future:
'Pass me that bottle'
'i can't, I don't want to mess up my volume'
You're an absolute open source software contributor. Thanks for the effort and the video. Keep going !!
Did you get this course?
I see great potential in this idea, especially for phones, perfect self-explanatory code, thank you :)
I am actually surprised that I understood everything you were doing, which means you explained it well
Hello Sir, your videos have added so much character and life to my projects. Thank you. I was wondering if you could make a tutorial on full body tracking. Thanks again !
Someone had the same idea back in college, but their advisors told them it was a long-term project. They still passed but dialed it down to ASL interpretation. That's still a lot of work done
The best minutes I have spent on the internet!! Such an interesting video by all means!!
I was watching you every day and didn't realize I wasn't subscribed. Thanks!
Thank you very much! Helped me in a gesture detection project. You are a legend
This was an amazing tutorial. I'm having a lot of fun learning these computer vision applications. Thank You @Murtaza's Workshop. See you in AI virtual Mouse Video set tomorrow.
Well done, old man, you knocked out a solid tutorial, beautifully done.
I built one for myself, now I can show off in front of the girls.
Alright, I'm off, hugs to everyone.
You are lit seriously🔥💥....you are the only youtuber I believe who isn't money minded....you deserve a lot more ❤️❤️❤️
I'm subbed! This channel is a hidden gem for sure : ), I'll share it with my community. Thanks for making this tutorial video, it educates a lot of people!
The best 36 minutes I've spent on YouTube
Cool. One small suggestion: the measured finger distance also depends on how far the hand is from the camera, so it may need to be combined with a hand-to-camera distance estimate to set the value range instead of a hardcoded 50 to 300. Also, the volume varies continuously as the hand moves; we may need some way to freeze it.
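One way to sketch the normalization idea above: divide the pinch distance by a per-frame hand-size reference (here the wrist-to-index-MCP distance, an arbitrary choice rather than anything from the video), so the ratio stays stable as the hand moves toward or away from the camera:

```python
import math

def normalized_pinch(thumb_tip, index_tip, wrist, index_mcp):
    """Pinch distance expressed relative to hand size, roughly invariant
    to how far the hand is from the camera.

    All arguments are (x, y) pixel coordinates, e.g. taken from MediaPipe
    hand landmarks; the choice of reference segment is an assumption.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    hand_size = dist(wrist, index_mcp)
    return dist(thumb_tip, index_tip) / hand_size if hand_size else 0.0

# the same gesture at double the distance yields the same ratio
near = normalized_pinch((100, 100), (200, 100), (100, 300), (100, 200))
far  = normalized_pinch((50, 50), (100, 50), (50, 150), (50, 100))
```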
Thank u for creating and sharing this type of content!
This is amazing. You deserve a nobel prize.
Bro seriously
It's awesome
I had much to learn and you taught it in an easy way
Thanks bro
Thanks for your awesome and more importantly "without loophole" tutorial
you are the best teacher , one in a million , thanku helped a lot
That's impressive. I made a few changes so the program calculates the hand's distance from the camera; that way, since the numbers change when the hand is far away, it performs more adequately
Hi Murtaza, I noticed the same issue. Since the volume is calculated from the line length, it also changes as you move closer to or farther from the camera while keeping the same distance between the fingers (index and thumb). How did you manage to calculate the distance from the camera to introduce a correction to the algorithm?
You are the most awesome one I have ever seen. I can understand everything you said; I can't believe myself. You're speaking crystal clear, brilliant. So I think to myself we are speaking the same language. You made me love OpenCV. Good to see you, good to hear you. God bless you and those like you.
Interesting, currently learning python and hope to be able to do this one day. You are an inspiration
Have you done it friend?
Did you complete this now?????????????
You are doing well, Murtaza. I am benefiting from watching you. One day you will get 1000k subscribers.
Murtaza's Workshop - Robotics and AI
- awesomEE!!
You are the best teacher I have seen
This technique will be even better two more papers down the line!
For people working on Linux, use:
from subprocess import call
volume = int(volume)  # percentage, 0-100
# set the PulseAudio master volume through amixer
call(["amixer", "-D", "pulse", "sset", "Master", str(volume) + "%"])
Hope that helps!
Thx
Thanks
Great topic, thanks 👍
I'm really blown away by how you explained it 😍😍. Bro, which versions of PyCharm, Python, and Anaconda are you using?
This is really helpful. You are a genius, man, keep it up.... Thanks a lot..
Nice tutorial... But can you make a tutorial showing how to know the positions of the hand (up, down, right, and left)?
this content is goood..
i finished this project now...
Thanks for sharing. It is very comprehensive.
underrated channel
Thank you so much for these tutorials
Dammmmmmmmmmmmm. This is ridiculously cool!
Great job once again buddy
You do such great videos! This is exactly what I was looking for. Thank you!
amazing tutorial. do more like these
pyautogui is also a module which can be used for this. Great video!
thank you man, you make the best tutorials I've seen. I kneel before you.
You have one new subscriber, thanks for the video!!!!
I looked into your code and highly suggest that you handle the case when you need to stop adjusting the volume and save it. I tried it by calculating the distance between the remaining fingers and the bottom landmark near the wrist. Good video btw
How? What's the syntax?
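A minimal way to express the "stop adjusting" idea from this thread (not the tutorial's own code): only let the pinch drive the volume while a confirmation flag is true. That flag could come from any gesture test, e.g. cvzone's fingersUp() reporting the pinky raised; that choice is an assumption for illustration:

```python
def update_volume(current_vol, pinch_vol, adjusting):
    """Follow the pinch-driven volume only while 'adjusting' is True;
    otherwise keep (freeze) the last confirmed value."""
    return pinch_vol if adjusting else current_vol

vol = 40
vol = update_volume(vol, 75, adjusting=True)   # gesture active: follow the pinch
vol = update_volume(vol, 10, adjusting=False)  # gesture released: stays frozen at 75
```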
Thank you for your precious knowledge :)
This is impressive.Congrats
Thanks again, it helps a lot; really useful.
Btw, I always write the code as I watch your videos to add my own comments to it, but I've noticed the wrong code file is uploaded on your blog (it's the previous lesson's one)
Amazing work
I'm not able to open the code in this link www.murtazahassan.com/courses/opencv-projects/
please tell me how to access the code
yes it's wrong actually
Hello! can i get the right code?
You are very good teacher!
Awesome , this is what I was looking for, thanks
Great work man...keep it up❤️❤️❤️❤️
Great video, like always..!! THX
Thank you so much for your great video!
You're always doing the best because you are the best 😎
This is so freaking cool man!
really nice video, I have learned a lot, thank you
Quite nice project! 👏
Brother, what an amazing series ❤️❤️😂
Nice video. Found it very helpful.
Thank you for this fantastic content, can we used this type of repo with kinect v2 device ?
You're a legend ! Thank you
Great tutorial keep it up 💯
To be honest, this is really cool.
very cool video keep it up👍
If you are getting an error reading the x and y values with hand['lmList'][8], use hand['lmList'][8][:2] instead.
This is because the latest version of cvzone returns x, y, z values per landmark instead of only x, y, so you have to take just the first two elements by slicing with [:2].
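In other words, the slice just drops the extra z coordinate. A tiny illustration with made-up values:

```python
# Newer cvzone returns [x, y, z] per landmark; older versions returned [x, y].
landmark = [320, 240, -12]   # hypothetical hand['lmList'][8] value in newer cvzone
x, y = landmark[:2]          # drop z so two-name unpacking still works
print(x, y)                  # -> 320 240
```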
could there be a way to normalize the length based on the distance of the hand to the camera using this z value? Thanks for the vid!
can you please provide handtracking code
first of all thanks sir for this great idea and It is daaaammmmm super cool
you are a very good man, thank you :)
Wow I liked your video.Good job
Truly underrated content.... bro you are the best
Great video i just went to the bathroom watching this and stayed in the bathroom until i finished watching this video great explanation! 😊😊😊😊😊😊
Cool tutorial Murtaza! I'm going to implement this myself.
It lacks one thing though: there needs to be a way to confirm the final volume. At 35:45, while moving your hand out of the cam's view, you change the volume from 0% to 60% unintentionally.
I'm not able to open the code in this link www.murtazahassan.com/courses/opencv-projects/
please tell me how to access the code
@@affanskm3530 I did not try it yet, but my guess is that you have to enroll in the course on his website first before you can access the code.
Or just watch the video and copy it by typing it yourself
Also, some kind of normalization is needed, because the length range depends on the distance from the camera to the hand
Excellent, I managed to do it. It's very fun and interesting, I'm in shock
Thank you sir for this tutorial.
Your videos are awesome!!!!!!
Ok am subbing u. Worth it.
Thanks for this wonderful video
At the end you should perform a linear to log conversion, because sound scales are logarithmic
I said the same thing XD
Thank you, I didn't know that. Is that true for this library, or is it a common thing?
@@mr_sugarcube It's a common thing (at least in Windows AFAIK)
Great lecture! 👍👍 The only suggestion I have is that pycaw navigates the master volume at an exponential scale rather than a linear one. Hence, if the master level is reduced by 10, say from -10 to -20, the actual volume is halved.
So when the distance between your fingers is half the max distance, the volume is only 2 or 3 (not 50). Hence, the distance can be made proportional not to the master volume but to the log of the level. I directly correlated the distance between the fingers to the actual volume level on Windows and then exponentiated it to get the master level that needs to be set. This gives a much smoother correlation between the finger distance and the actual volume on Windows.
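Sketching the log correction described in this thread (the -65 dB floor is an assumed device minimum, and should really come from pycaw's GetVolumeRange(); pycaw's endpoint also exposes SetMasterVolumeLevelScalar, which accepts a 0-1 scalar and sidesteps the dB math entirely):

```python
import math

def percent_to_db(percent, min_db=-65.0):
    """Convert a perceived-volume percentage (0-100) to a dB master level.

    Because loudness is logarithmic, mapping percent linearly to dB makes the
    mid-range feel nearly silent; taking the log of the percentage instead
    gives a much smoother response across the pinch range.
    """
    if percent <= 0:
        return min_db
    return max(min_db, 20.0 * math.log10(percent / 100.0))

print(percent_to_db(100))  # 0.0 dB (full volume)
print(percent_to_db(50))   # about -6.0 dB (half amplitude)
```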
Awesome tutorial ❤️
you're the best Murtaza
Fantastic, thanks for sharing!!
Thanks for this amazing video
Woow it's amazing 🔥🔥🔥
Amazing tutorial sir
Is there a library for Ubuntu? I can't make it work with pycaw on Ubuntu since it's for Windows, and I don't have Windows
This was amazing
Thank you for your effort and great value videos!
Does your computer vision course cover linear algebra math libraries and other similar computer vision libraries?
All this other bloggers making todo apps, congrats on such rich content
Cool video, thanks :)
gave me a clue how to start with y from zero! Thanks
you are awesome bhai.....
The video is so great and excellent. Say I have 2 screen displays: 1 is my laptop and 1 is my monitor. Usually I have to click the mouse to the display that I want to type or use. Is there any way that we can use gesture to click on the screen that we want? Say number 1 is for the laptop screen, and number 2 is for the monitor.
very cool project nice
Congrats! Very good video, brother.
My problem is that pycaw works only on Windows. I have to look for another library that works on Linux.
Thank you very much for those tutorials.
Mine too, did you find any solution?
Thank you so much!
Well done!
We can also initialize the volBar and volPerc with the current system volume:
curr_vol = volume.GetMasterVolumeLevel()              # current level in dB
vol_perc = np.interp(curr_vol, [-65, 0], [0, 100])    # dB -> percentage
vol_bar = np.interp(curr_vol, [-65, 0], [400, 150])   # dB -> on-screen bar position
Hope it helps....
Buddy, can you help me with this? I am working on a Mac; how do I get volume details on macOS? On Windows it is done with pycaw.
I can only print hello world and i am here!😇
I need this motivation and confidence 🤗😀😂