High Definition
Joined Jun 26, 2020
We are First Tech Challenge (FTC) Team 18225 High Definition, a veteran robotics team from Bellevue, Washington composed of 11 enthusiastic students in grades 7-11. Our mission is to promote STEM and FIRST within our community to inspire the next generation of innovators to solve the challenges of the future! We spread mechanical, programming, business, and other practices within the community to ensure their success. Check out our channel for interesting videos and our Connecting with Professionals series for the Power Play 2022-2023 season!
Introduction to Onshape: Learning Computer-Aided Design
This is a part of a larger event FTC 18225 High Definition hosted, named "Building STEM and Other Key Capabilities with Fun: Our Story as an FTC Team and Opportunities with FIRST."
Views: 185
Videos
Alumni Success Stories: From FIRST to College
195 views · 1 year ago
This is a part of a larger event FTC 18225 High Definition hosted, named "Building STEM and Other Key Capabilities with Fun: Our Story as an FTC Team and Opportunities with FIRST."
Introduction to Computer Vision in FTC Java Programming
964 views · 1 year ago
This is a part of a larger event FTC 18225 High Definition hosted, named "Building STEM and Other Key Capabilities with Fun: Our Story as an FTC Team and Opportunities with FIRST."
Two Segment Suction-Cup Arm
337 views · 2 years ago
Hey! Watch this video to learn how to control a two-segment arm using a belt. You'll also learn how to use a suction cup as an intake and delivery system. Hope you learned something :) Email: ftc18225@gmail.com Website: ftc18225.everstem.org/ GitHub: github.com/HiiDeff/
Balanced Crane with Motor
228 views · 2 years ago
Hey! Watch this video to learn how to use motors and gears to balance an arm and drive an angle change. Hope you learned something :) Email: ftc18225@gmail.com Website: ftc18225.everstem.org/ GitHub: github.com/HiiDeff/
FTC 18225 HIGH DEFINITION | FREIGHT FRENZY REVEAL
3.7K views · 2 years ago
FTC 18225 HIGH DEFINITION | FREIGHT FRENZY TRAILER
3K views · 2 years ago
FTC 18225 High Definition WA State Control Award Video Submission (Freight Frenzy 2021-2022)
14K views · 2 years ago
FTC 18225 2021-22 Episode 1 - The Duck Story
697 views · 3 years ago
FTC 2021-2022 Freight Frenzy Game Animation (no intro)
173 views · 3 years ago
FTC 2021-2022 Freight Frenzy Game Animation
283 views · 3 years ago
18225 High Definition WA State Match 6
229 views · 3 years ago
FTC 18225 High Definition WA State Control Award Video Submission
2.1K views · 3 years ago
HD 18225 WA Feynman Interleague Match 2
127 views · 3 years ago
18225 High Definition Compass Award Nomination - Jason Qiu
97 views · 3 years ago
FTC Team 18225 Feynman Interleague Control Award Submission
263 views · 3 years ago
Good job, keep it up 😜😜
that claw really has aimbot
Hi, what wheels are those?
They are mecanum wheels with different dimensions than the standard goBILDA ones; we actually found them on eBay!
opensky-network.org/ is the website Professor Peng used to get live positions of planes around the world!
Awesome video! Loved how you explained the mechanism :)
How was the suction triggered? I don't see an air line or anything evacuating the air from the suction bulb.
Nope, it isn't triggered! It auto suctions a specific surface type (smooth, flat) and you need to bend the stick to release it.
DANG, that's amazing! Both your software and hardware are just top notch! I have a quick question: what motors did you all use for your turret?
Thanks! We use a goBILDA Yellow Jacket motor.
Hey, something you should double-check: be careful that the suction cup isn't ruled a closed gas device, since that isn't competition legal. You may want to consult an official on this.
Hey, thanks for pointing that out! That's a really good thing to know for anyone planning on using the suction cup on an FTC robot :))! Thanks again!
How did you create the rotating platform? 3D printed? Custom made from some material? Does it use a servo or a motor to rotate?
Cool question! Our turret is a lazy susan bearing powered by a motor. There is a carbon fiber plate attached on top of the lazy susan bearing so that we can easily attach our delivery arm. We also have another lazy susan bearing on the intake arm but that is custom 3D printed.
most advanced clawbot ever made
For our past Connecting with Professionals events please head over to ua-cam.com/channels/Kenlz_JmUq6KS9YrkgNT8g.html !
Hello! We apologize for the connection difficulty we had with Dr. Chai. We didn't get a chance to ask him these questions because of the connection issue, but here they are, along with a brief explanation from him for some of them. If there are any more questions, you can email us at ftc18225@gmail.com or message us on Instagram at @ftc18225.

Are there any disadvantages to deep learning and non-deep learning? - There may be overfitting: everything is based on predictions, and overfitting can throw them off. Another disadvantage is that a prediction may be unexplainable - for example, in medical settings, when a model finds a tumor or something similar, doctors need an explanation.

Which layer of learning do you think is the most important? - Definitely the prediction layer. It goes very deep and is overall more complex. The convolution layer is more commonly used and generally simpler than the prediction layer.
Did you run your vision in a separate thread? How long did it take to get a picture and fully process it? Did you attempt to find the real-world coordinates of the freight? If so, how accurate was that?
How does your wall odometry work?
Hi! Since there were pipes on the field, adding typical odometry would slow down our robot. Since our robot largely stays against the wall, we instead measure ticks with an encoder that tracks the rotation of a wheel. We added a spring within the walldometry so that it can ride over uneven surfaces, and we use the tick measurements to improve the accuracy and reliability of our movement. Let us know if this doesn't make sense! :)
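The tick-to-distance conversion behind a tracking wheel like this can be sketched in Java; the encoder resolution and wheel size below are made-up illustrative values, not the team's actual hardware constants:

```java
// Sketch of the tick-to-distance math behind a "walldometry" tracking wheel.
// TICKS_PER_REV and WHEEL_DIAMETER_IN are hypothetical example values.
public class Walldometry {
    static final double TICKS_PER_REV = 8192.0;   // hypothetical encoder resolution
    static final double WHEEL_DIAMETER_IN = 2.0;  // hypothetical tracking-wheel size

    /** Converts raw encoder ticks into inches traveled along the wall. */
    public static double ticksToInches(int ticks) {
        double circumferenceIn = Math.PI * WHEEL_DIAMETER_IN;
        return (ticks / TICKS_PER_REV) * circumferenceIn;
    }
}
```

With these example numbers, one full wheel revolution (8192 ticks) maps to one circumference of travel, roughly 6.28 inches.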
WOW this is an amazing video and robot!!!!
I love your design so much!!! Could you please do a more in depth explanation of your robot or share its CAD file?
I agree with Andrei here, this is awesome and I would love to see a more in-depth look!
reveal after worlds?? pro move
Awesome! We were bummed that we didn't have a chance to stop by your pit at Worlds.
Amazing! How does ur arm logic work? can i see code pls
Lol!!
BEST. ROBOT. REVEAL.
The wall odometry is genius
The coolest one so far
gl erik
This is some crazy vision! I was wondering if you knew where I could go to learn more about computer vision, TensorFlow, and/or OpenCV - like the commands, and how to really understand and utilize the vision. I guess what I'm asking is: how did you learn to use TensorFlow Lite, and what could I do to master it like you? Thanks a lot :)
Hi! We learned TensorFlow Lite basically by trying it out and experiencing what it was capable of. There are plenty of helpful resources in the external samples in the FtcRobotController folder, for example this one: github.com/FIRST-Tech-Challenge/FtcRobotController/blob/master/FtcRobotController/src/main/java/org/firstinspires/ftc/robotcontroller/external/samples/ConceptTensorFlowObjectDetection.java We've found that TensorFlow works well for game elements (a new TFLite model is provided every year), but it is difficult to use for custom objects or other use cases like warehouse freight detection. That's why we decided to create a custom vision algorithm, which you can see at 0:33. If you want to learn more about our custom vision algorithms, feel free to contact our lead programmer at null_awe#0184 on Discord. There are probably great videos on how to use OpenCV for FTC on UA-cam, but we haven't really used OpenCV, at least not yet.
Wow! Amazing! I'm definitely going to try to implement that kind of intelligence in our bot.
Feel free to reach out if you have any questions!
@@highdefinition6017 Actually, can you give me an idea of where to start? Maybe some theory, what to Google, etc.
@@yotamdubiner2545 Try looking at the sample class in external samples called "ConceptWebcam". This teaches you how to retrieve image frames from the webcam directly. Then, the most important thing is converting the RGB color format of the pixels to HSV, which you can work more with (filtering is much easier with HSV).
@@highdefinition6017 I know how to do that; I'm familiar with EOCV. I just need to know how you determine the distance to an object from its pixel position in the image
@@highdefinition6017 and how u determine the angle
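The RGB-to-HSV step suggested a few comments up can be sketched with the standard java.awt.Color helper (the FTC SDK is not needed for the conversion itself; the "red" thresholds below are made-up examples, not tuned values):

```java
import java.awt.Color;

// Sketch: converting an RGB pixel to HSV/HSB so color filtering is easier.
// The hue and saturation thresholds are illustrative, not tuned values.
public class HsvFilter {
    /** True if a pixel is strongly red: high saturation, hue near the 0/360 wrap. */
    public static boolean isRed(int r, int g, int b) {
        float[] hsb = Color.RGBtoHSB(r, g, b, null); // h, s, b each in 0.0..1.0
        float hue = hsb[0];
        float sat = hsb[1];
        return sat > 0.5f && (hue < 0.05f || hue > 0.95f);
    }
}
```

Filtering on hue like this is much more robust to lighting changes than thresholding raw RGB values, which is why HSV is the usual recommendation.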
ua-cam.com/video/Jsez0YlsYLI/v-deo.html Hi Guys look 17728 FTC.
One of the best robots I've seen this season. I don't see a link to your portfolio. Can you share it?
Hi there! We have not made the portfolio public yet as we are still mid-season. Following the World Championships, we will most likely make it public - so stay tuned for then!
@@highdefinition6017 Have you made the code public? You mentioned in the video that you were considering doing that.
sick <3
Hello, tell me please: are you using Dynamixel 12A servos? If so, please tell me how you managed to reflash them to use with the Control Hub, and which library did you use?
We're using 11 goBILDA servos and one Savox servo, not Dynamixel servos. I'm not sure what you mean by reflashing them to use with the Control Hub.
How cool is that! Congrats, guys. How do you do the real-time detection? OpenCV?
All of our vision algorithms are completely original this year. The logic described in the video runs in a background thread, and when highly optimized, it can cycle camera frames extremely quickly, which allows us to do real-time detection. If you are in the FTC Discord, there was another example of real-time detection that also used a highly optimized version of our code: discord.com/channels/225450307654647808/771188718198456321/946616121451249725
@@highdefinition6017 Congrats, and keep going.
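The background-thread pattern described above can be sketched like this; grabFrame and process here are stand-ins for real camera and detection code, not the team's actual implementation:

```java
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Supplier;

// Sketch: a vision loop publishes its newest result through an
// AtomicReference so the main loop can read it without blocking.
public class VisionLoop {
    private final AtomicReference<String> latest = new AtomicReference<>("none");
    private volatile boolean running = true;

    /** Stand-in per-frame processing: "freight" if any pixel value is lit. */
    static String process(int[] frame) {
        for (int px : frame) if (px > 0) return "freight";
        return "none";
    }

    /** Starts the worker; grabFrame is a stand-in for real camera capture. */
    public void start(Supplier<int[]> grabFrame) {
        Thread worker = new Thread(() -> {
            while (running) latest.set(process(grabFrame.get()));
        });
        worker.setDaemon(true); // never keeps the app alive on its own
        worker.start();
    }

    public String latest() { return latest.get(); }
    public void stop()     { running = false; }
}
```

The main loop just calls latest() each iteration and always sees the most recent detection, which is the handoff that makes real-time use possible.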
Simply awesome. Do you have a repository for your code?
Yeah, the repo is private as of now because it's not cleaned up, which we'll probably do if we release the code to the public. Our shipping element detector code, however, is public here: github.com/HiiDeff/ShippingElementDetector
@@highdefinition6017 thank you
Well well well, now, isn't this a beauty? 🔥🔥🔥
Woah cool robot. No Teflon tho 0/10 :(
Maybe in the future...
:OOOOO
This is so sick! Wish i could be on this team
Omg, you guys are absolutely crazy! Do you use a motion-planning library like Roadrunner, or did you custom-make the movements? And as for the hybrid PID, how did you create a tuner for that? I can't wait to see you guys at Worlds! I'm definitely going for y'all's pins if you have any lol
Nope, everything is custom-made! For the PID tuner, we have a base class that handles all the normal PID logic for every PID model (for any subsystem); for a new subsystem, all we have to do is implement a few methods (getError, setPower, cancel), and then it's ready to be tuned in tele-op.
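A minimal sketch of the base-class pattern described here, assuming textbook PID math: the method names getError, setPower, and cancel come from the comment, while the structure and gains are illustrative, not the team's actual code:

```java
// Sketch: shared PID math lives in an abstract base class; each subsystem
// only implements getError, setPower, and cancel. Gains are placeholders.
public abstract class PidController {
    private final double kP, kI, kD;
    private double integral, lastError;
    private boolean firstUpdate = true;

    protected PidController(double kP, double kI, double kD) {
        this.kP = kP; this.kI = kI; this.kD = kD;
    }

    protected abstract double getError();            // target minus current
    protected abstract void setPower(double power);  // drive the subsystem
    public abstract void cancel();                   // stop this controller

    /** One control step; dt is the time in seconds since the last call. */
    public double update(double dt) {
        double error = getError();
        integral += error * dt;
        double derivative = firstUpdate ? 0.0 : (error - lastError) / dt;
        firstUpdate = false;
        lastError = error;
        double power = kP * error + kI * integral + kD * derivative;
        setPower(power);
        return power;
    }
}
```

A turret subsystem, for example, would subclass this with getError reading its encoder, and the same tuner UI could then adjust kP/kI/kD for any subsystem.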
I'm just curious, but you said your robot has 25 sensors!? Could you list the ones you used?
1 IMU, 2 Logitech webcams, 5 REV 2m distance sensors, 6 motor encoders, and 12 servo encoders.
Hi guys, how did you make the rotating platform?
The rotating platform in both the delivery arm and the intake arm is a lazy susan bearing. For the delivery arm, it was big enough that we could buy it online. However, since the intake arm was smaller, we used Fusion 360 to CAD our own lazy susan bearing and 3D printed it.
@@highdefinition6017 But how did you motorize it?
@@mateusbernart2101 I'll explain the intake arm's lazy susan bearing, but the delivery arm works similarly with a motor in place of a servo. The lazy susan bearing has two parts: an inner part (mounted to the drivetrain) and an outer part. In between is a set of balls that allows the bearing to rotate. A gear is attached to the outer part and the plate above it, and it is driven by the servo you see on top of the bearing. When the servo rotates, it turns that gear against another gear located on the inner part of the bearing, so both gears move and the intake arm rotates! If my explanation isn't clear, we can also share the CAD file so you can take a better look at the components.
@@highdefinition6017 would it be possible for you to share the cad?
@@VeerNanda Added the link to the video description! :)
wowww!!
Woah... what kinds of mecanum wheels do you use?
6” mecanum wheels!
rapaaaaaaz
Nice robot. Can your intelligent claw detect the individual blocks at the start of the game when they are all clumped together?
Yep! We don't actually detect all of the blocks, only the closest one - the single block that appears lowest in the image.
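The "closest = lowest in the image" rule can be sketched like this (Box is a hypothetical stand-in for whatever result type the real pipeline produces; image y grows downward, so the largest y is nearest the camera):

```java
import java.util.List;

// Sketch: pick the detection lowest in the frame as the "closest" block.
// Box is a hypothetical stand-in for a real detection result type.
public class ClosestBlock {
    public static class Box {
        public final double x, y;  // bounding-box center, in pixels
        public Box(double x, double y) { this.x = x; this.y = y; }
    }

    /** Returns the detection lowest in the frame, or null if there are none. */
    public static Box closest(List<Box> boxes) {
        Box best = null;
        for (Box b : boxes) {
            if (best == null || b.y > best.y) best = b;
        }
        return best;
    }
}
```

This works even when the blocks start clumped together, because only the single lowest bounding box is selected regardless of how many overlap.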
This is absolutely insane. Congrats on getting to worlds! Quick question: I noticed that in your detection code you have set it up to use a webcam. Is it possible to achieve this with the phone camera as well?
We actually haven't tried it with a phone camera, although we've been asked this question before. Our guess is you could look at the sample classes provided in the FtcRobotController and see if there is something that can retrieve camera frames. When we tried this last year for ring detection (counting orange pixels), the only way we could retrieve an image from the phone camera was to use Vuforia to return an image (we've forgotten the specifics of the process, but you could probably find them online). We're not aware of another way currently.
@@highdefinition6017 Alright thank you. I'll try it out and let you know how it goes.
Hey guys! This is easily one of the coolest robots I have seen this season. The only question I had is what you used for your intelligent claw - do you use regular cameras or any special programs? Thanks again, and congrats on making Worlds!
We use a regular Logitech webcam (I'm not sure of the specific model), and we don't use any libraries for detection; all the code used for processing the image was developed by our team.
Congrats on worlds!
Thank you so much! :D
This is really off-the-charts amazing! Well done!!
This is awesome. The robot is intelligent.
If anyone deserves the control award, it has got to be you guys!
Thanks!
How do you do PID based on robot constants to not need any tuning? That seems very useful.