Odometry 101 for FIRST Tech Challenge Robots

  • Published 4 Jul 2024
  • A deep dive into the math of a simple odometry implementation for FIRST Tech Challenge robots that was developed before and during the 20/21 FTC Ultimate Goal season.
    Here is our hardware design for the Odopods:
    cad.onshape.com/documents/5e3...
    You will need an OnShape account which is free for educational purposes.
    Omni wheels were purchased from RobotShop:
    www.robotshop.com/en/60mm-alu...
    REV encoders:
    www.revrobotics.com/rev-11-1271/
    0:00 - Intro
    0:49 - Skystone autonomous without odometry wheels
    1:24 - Gluten Free Inspiration
    1:50 - Building the odometry hardware
    3:55 - Hardware overview
    5:43 - The math behind odometry
    20:19 - The odometry code
    24:54 - A Test program and some calibration
    27:18 - Autonomous run with odometry
    The FUN interview with Steven and Peter from FTC 11115 Gluten Free at the 2019 MTI:
    • Behind the Bot FTC 111...
    There is an excellent video series done by FTC 9794 Wizzards.exe called the Odometry Spell Book:
    Part 1: Building a Modified goBILDA Strafer Chassis Kit
    • Odometry Spell Book Pa...
    Part 2: Installing Odometers into Drivetrain
    • Odometry Spell Book Pa...
    Part 3: Wiring an Odometry Drivetrain
    • Odometry Spell Book Pa...
    Part 4: Understanding and Getting Started with the Odometry Software
    • Odometry Spell Book Pa...
    Part 5: Learning To Move To Specific Positions
    • Odometry Spell Book Pa...
    FTC 18219 Primitive Data has launched a website for an OpenOdometry implementation:
    openodometry.weebly.com/
    By popular demand, here is the actual code of our main odometry function (you will have to define the variables and do the initialization yourself):
    public void odometry() {
        // remember the encoder readings from the previous iteration
        oldRightPosition = currentRightPosition;
        oldLeftPosition = currentLeftPosition;
        oldAuxPosition = currentAuxPosition;
        // read the three dead-wheel encoders (the right encoder's sign is flipped to match how the pod is mounted)
        currentRightPosition = -encoderRight.getCurrentPosition();
        currentLeftPosition = encoderLeft.getCurrentPosition();
        currentAuxPosition = encoderAux.getCurrentPosition();
        // tick deltas of the left, right and aux wheels since the last iteration
        int dn1 = currentLeftPosition - oldLeftPosition;
        int dn2 = currentRightPosition - oldRightPosition;
        int dn3 = currentAuxPosition - oldAuxPosition;
        // the robot has moved and turned a tiny bit between two measurements;
        // cm_per_tick converts ticks to cm, LENGTH is the distance (in cm) between the left and right odometry wheels
        double dtheta = cm_per_tick * ((dn2 - dn1) / LENGTH);
        double dx = cm_per_tick * ((dn1 + dn2) / 2.0);
        double dy = cm_per_tick * (dn3 + ((dn2 - dn1) / 2.0));
        telemetrydx = dx;
        telemetrydy = dy;
        telemetrydh = dtheta;
        // small movement of the robot gets added to the field coordinate system;
        // the rotation is applied half before and half after the translation
        pos.h += dtheta / 2;
        pos.x += dx * Math.cos(pos.h) - dy * Math.sin(pos.h);
        pos.y += dx * Math.sin(pos.h) + dy * Math.cos(pos.h);
        pos.h += dtheta / 2;
        pos.h = normDiff(pos.h);
    }
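    The XyhVector class and the normDiff() helper are not included above. A minimal sketch that is consistent with how odometry() uses them (the names x, y, h and normDiff come from the snippet; the bodies are an illustration, not the original implementation) could look like this:
    // Simple pose holder: x and y in cm, h (heading) in radians.
    public class XyhVector {
        public double x, y, h;
        public XyhVector(double x, double y, double h) {
            this.x = x;
            this.y = y;
            this.h = h;
        }
    }
    // Wrap an angle into the range (-pi, +pi] so the heading stays bounded.
    public static double normDiff(double angle) {
        while (angle > Math.PI) angle -= 2.0 * Math.PI;
        while (angle <= -Math.PI) angle += 2.0 * Math.PI;
        return angle;
    }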
  • Science & Technology

COMMENTS • 65

  • @nbluto • 2 years ago

    Wow, this is a great overview of odometry. I have learned so much! Thank you so much.

  • @ftcdontblink5428 • 2 years ago +12

    This video is super good. Loved how the equations are so simply derived!

  • @EXPLODINGETDOOD • 2 years ago

    Thank you so much! This video will make the perfect base for a lesson on odometry for my vex edr team. This is well crafted and way easier to understand than the few other odometry videos

  • @alvictor1291 • 2 years ago +17

    An excellently produced video answering the age old question in the FTC community -- What exactly is Odometry? Love how you talked about a little bit of the history of odometry, covered hardware + software in a clear and easy to access manner. This video is definitely going to be a new reference for those trying to implement odometry.

  • @jozefsprunk607 • 1 year ago

    GREAT video! helping my ftc team so much. still really struggling with implementing the code but a great start thanks to you!

  • @ilanmower • 2 years ago +29

    can odometry allow me to localize and find the meaning of my existence?

  • @andew2007 • 1 year ago

    Love the team name!

  • @mrmehcrawley • 2 years ago +2

    Probably the best educational video there is!

    • @DrBatanga • 2 years ago +1

      I don't necessarily think so, but I'll take such praise any day!

    • @mrmehcrawley • 2 years ago

      @@DrBatanga in my opinion it is👍

  • @thomaslarson6199 • 2 years ago

    Thank you for all of this. We will be looking to attempt this code this year. See you at the competitions! FTC 15304.

  • @bleh1323 • 2 years ago +1

    Last year our team tackled odometry as well. It was a welcome improvement. (Of course we lost our free wheels this year...)

  • @RlxRlx1 • 2 years ago

    so cool sir!

  • @4drobotics253 • 2 years ago

    Many thanks for your video! Excellent details!

  • @glitchtime404 • 4 months ago

    Trying to figure this code out was like trying to read ancient hieroglyphs with no water and only the Rosetta Stone, but at least the video explained it quite well, so that’s nice. 7.5/10 tutorial

  • @marcserraortega8772 • 1 month ago

    Thanks a lot!

  • @danielesteves3435 • 1 year ago

    Fantastic video. Please make more follow ups or upgrades. We needed this. Thanks from #4416

    • @DrBatanga • 1 year ago

      Will do. We are currently wrestling with path following and OpenCV and I'm hoping the dust will settle soon.

  • @4drobotics253 • 2 years ago +2

    The video is amazing, but could you show us a bit of code for the XyhVector class?

  • @gonzaloreyes2609 • 2 years ago +2

    Love the video and thanks for your dedication.
    Just one question, probably a dumb one: for the field coordinates, why are you adding just half of dTheta to calculate the X and Y position? I don't fully get it.

    • @DrBatanga • 2 years ago +1

      Once you have an estimation for the displacement and change in the heading, you need to apply it to the field coordinates. You could (a) first apply the dx and dy and then dTheta or (b) dTheta first and then dx and dy. There will be a small numerical difference between (a) and (b), so I decided to split the difference by applying dTheta/2, then dx and dy and finally another dTheta/2.
      As with all numerical approximations, there will be a tiny error in each iteration. Luckily, some of the errors cancel out, but others may add up over time. The goal is to minimize the errors across the whole system starting from the mechanical sensors all the way to the numerical trickery.
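      In code, the three options look roughly like this (a sketch only; dx, dy and dtheta are the per-iteration deltas from odometry() and pos is the field pose):
      // (a) translate first using the old heading, then rotate
      double xA = pos.x + dx * Math.cos(pos.h) - dy * Math.sin(pos.h);
      double yA = pos.y + dx * Math.sin(pos.h) + dy * Math.cos(pos.h);
      double hA = pos.h + dtheta;
      // (b) rotate first, then translate using the new heading
      double hB = pos.h + dtheta;
      double xB = pos.x + dx * Math.cos(hB) - dy * Math.sin(hB);
      double yB = pos.y + dx * Math.sin(hB) + dy * Math.cos(hB);
      // (c) split the difference: rotate by dtheta/2, translate, then rotate by the other dtheta/2;
      // this is what odometry() does, and its result lands between (a) and (b)
      double hMid = pos.h + dtheta / 2.0;
      double xC = pos.x + dx * Math.cos(hMid) - dy * Math.sin(hMid);
      double yC = pos.y + dx * Math.sin(hMid) + dy * Math.cos(hMid);
      double hC = pos.h + dtheta;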

  • @pwochnick • 2 years ago +1

    Thank you so much for creating this video. Is it possible to also see your code for your test auto program? Still trying to understand how you are able to move the robot the way you do in your test auto. Again, thanks for putting this together.

    • @respieces5129 • 2 years ago +3

      Hey Paul!
      So we don’t plan on publishing our software, but here’s a little snippet from our code so I can give you a quick (but not really quick) explanation (we do all this in a LinearOpMode by the way):
      public void goToPosition(XyhVector[] p, double speedModifier) {
          robot.setupPosBezier(p, speedModifier);
          while (opModeIsActive() && robot.followingPath == true) {
              robot.odometry();
              robot.goToPosBezier();
          }
      }
      For starters, we went completely custom with our movement algorithm, so it’s very specific to our robot and all the other software we’ve written, which makes it difficult to explain over text, so I won’t go into much more detail than is necessary.
      That said, as you can see where the method is defined, we take two parameters: an array of instances of a class we call XyhVector, and another parameter called speedModifier. A single XyhVector contains 3 values: an x position, a y position, and a heading (we’ll come back to why we take an array of these in a moment). speedModifier, the other parameter, is simply a multiplier for the speed of our robot as it moves.
      As the method runs, we set up our robot’s movement with the method setupPosBezier, which calculates various things relating to our movement algorithm, like the path the robot will follow based on the points within a given array (p in this case, our previously defined XyhVector array), the time it should take to get there, the cutoff point at which we will stop following the path if it takes too long, and the distance we have to move/turn.
      Afterwards, a while loop begins, which lasts as long as the op mode is active and our robot doesn’t take too long to reach its desired destination. Within the loop we have a call to the familiar odometry() method (shown in the video) and goToPosBezier, which takes the results from setupPosBezier and translates them into movement of the robot. This while loop is crucial to the success of odometry in autonomous, as you can’t constantly track your robot’s position during the period without some way to constantly get new input.
      Hope that helped to answer your question; feel free to ask more if you’re curious or I just flat out didn’t give you the information you wanted!
      TL;DR: We run our movement code in the same while loop as our odometry readings, which constantly updates our position allowing us to move accurately

    • @pwochnick • 2 years ago +1

      @@respieces5129 Thank you so much for sharing the information that you have. I really appreciate it.

    • @DrBatanga • 2 years ago

      Most of our movements are on straight lines. We chop up the line into smaller segments, depending on the length, then use a P-controller to follow the waypoints. That includes partial angles for the turns. We also experimented with Bezier-curves for more complex movements, but the team was not totally satisfied with the performance/speed. I guess there is always a next season to make improvements...
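      A bare-bones sketch of one such P-controller iteration toward a waypoint (illustrative only; the gains and the setMecanumPower() helper are invented for the example):
      // One loop iteration: drive toward the next waypoint with proportional control.
      void driveTowardWaypoint(XyhVector target) {
          double errX = target.x - pos.x;                  // field-frame position error
          double errY = target.y - pos.y;
          double errH = normDiff(target.h - pos.h);        // wrapped heading error
          double kPxy = 0.05;                              // placeholder gains, tune per robot
          double kPh = 1.0;
          // rotate the field-frame error into the robot frame before commanding the wheels
          double fwd    = kPxy * ( errX * Math.cos(pos.h) + errY * Math.sin(pos.h));
          double strafe = kPxy * (-errX * Math.sin(pos.h) + errY * Math.cos(pos.h));
          double turn   = kPh * errH;
          setMecanumPower(fwd, strafe, turn);              // hypothetical drive helper that clips to [-1, 1]
      }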

  • @shailesh9rai • 5 months ago

    Hi, I have 3-wheel odometry but I cannot place my 3rd wheel exactly aligned with the centre of rotation. In fact I have to place it near my rear wheel. I put the feed-forward offset as a negative value, but how do I accommodate the angle in the code? Any suggestions on what to do when the centre of rotation and the third dead wheel are not aligned?

  • @eleelena1208 • 6 months ago

    Hello! This video helped me a lot to understand the concept of odometry. Sadly, when our team ordered the materials for this season we could only afford 2 REV encoders. I was wondering if there is any way we could implement odometry just by using two encoders on the x axis.

    • @DrBatanga • 6 months ago

      Yes, I can think of several options here, though I don't have any hands-on experience with those. In essence you need to track 3 parameters of the robot: moving forward/backward, moving left/right and turning, so you also need 3 sensors in the general case. But you could get away with 2 sensors if you don't move the robot in all 3 orientations at the same time.
      If you have fixed wheels (not mecanum or omni), the robot can only go forward/backward and turn, but it cannot go sideways, limiting the number of sensors needed to 2. We have used that in the past and it worked reasonably well for not-too-complex autonomous programs, just using the motor encoders.
      If your robot uses a mecanum drive base, you could use the IMU in the controller to get the turn angle of the robot and use the two odometry sensors for x and y. The IMU's angle tracking is not extremely accurate, but it may work just fine.
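      A rough sketch of the mecanum + IMU variant, assuming both dead wheels sit close to the centre of rotation so no offset correction is applied (getImuHeadingRadians(), encoderForward, encoderSide and the old* variables are placeholder names):
      double newHeading = getImuHeadingRadians();           // however you read the yaw angle from your IMU
      double dtheta = normDiff(newHeading - oldHeading);
      oldHeading = newHeading;
      int dnForward = encoderForward.getCurrentPosition() - oldForwardPosition;
      int dnSide    = encoderSide.getCurrentPosition() - oldSidePosition;
      oldForwardPosition += dnForward;
      oldSidePosition += dnSide;
      double dx = cm_per_tick * dnForward;                  // robot-frame forward movement
      double dy = cm_per_tick * dnSide;                     // robot-frame sideways movement
      // same field-coordinate update as in the three-wheel version
      pos.h += dtheta / 2;
      pos.x += dx * Math.cos(pos.h) - dy * Math.sin(pos.h);
      pos.y += dx * Math.sin(pos.h) + dy * Math.cos(pos.h);
      pos.h += dtheta / 2;
      pos.h = normDiff(pos.h);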

  • @kapilraut3763 • 8 months ago

    Please tell me what the delta n represents exactly

  • @shishghate • 1 year ago

    Great video, is the rest of the code posted someplace? Especially the part on line 46? The goToPosLinear method? We are curious to see how that works.

    • @DrBatanga • 1 year ago

      The short answer is no. Basically, I recommend some P or PID controller to follow a given path. There is no one-size-fits-all solution though. It depends on the hardware (mecanum vs. holonomic etc) and there is always a compromise between speed and accuracy.

  • @gabogaona1282 • 8 months ago

    Hi, I'm the captain of a Mexican FTC team and I have some questions about the code. What library did you use to define the XYHpos and where can I find it? I've been trying to use odometry on our robot and your video is so well explained, but I still have those questions. I'd be glad if you could answer them, THAAANKS!!

    • @DrBatanga • 8 months ago

      There is no official library or third party code that I could share. All you need is a class that holds the three variables for x, y and h (heading). See code in the description of the video. Getting the odometry values is one thing, following a path is something you need to add on top.

  • @RichRuggeri • 1 year ago

    Team 6418 question: for the two X encoders, couldn't you use the actual motor encoders, realizing of course that their resolution/tick count will be lower than the omni wheel version? Wouldn't this simplify things, because you would only need to add one omni wheel for the Y axis to the robot?

    • @DrBatanga • 1 year ago

      Good question. Yes, you can do odometry in many different ways, but the results will differ too, and it all depends on your drive base. When using motor encoders, you run into issues with wheel slippage and you don't register bumping into walls or other robots. We have looked at different approaches over the years and they all worked more or less:
      (1) 2 fixed drive wheels + 2 omni wheels or a caster wheel. There is a general closed mathematical solution for this type of robot which is somewhat complicated to implement, but you can divide your path into straight lines and turns to make the software simple. We used that years ago on our Lego robots and early FTC robots.
      (2) Mecanum wheels + IMU. Mecanum wheels are fairly good driving, but strafing cannot be tracked reliably with encoder ticks from the motors. We have used mecanum wheels in conjunction with an IMU in the past and got fairly good results.
      (3) Holonomic drive base (4 omni wheels). Very inaccurate odometry based on drive wheel encoders. You'll need dead-wheel odometry.
      To answer your question, my gut feeling is to go with either (1) or (2), or implement the complete 3 dead-wheel odometry. The latter gave us the best results by far, but the other two can be programmed and tweaked well enough as well.

    • @RichRuggeri • 1 year ago

      Team 6418 says thanks for the very prompt reply

  • @RejatKrishnan • 1 year ago

    So do you configure the encoders as motors in the configuration on the phones?

    • @DrBatanga • 1 year ago

      Yes, they are set up as dcMotors and need to be named and defined in the REV controller accordingly.
      There are two ways of using the encoders:
      (1) They are plugged into a port to which a motor is also connected. In this case the motor cannot use its own encoder and has to run without encoder. You can still read the encoder values from the odometry encoder plugged into the port. To make the program more readable, assign a new variable to the existing motor:
      motorSlides2 = hardwareMap.dcMotor.get("motorSlides2");
      motorSlides2.setDirection(DcMotorSimple.Direction.REVERSE);
      motorSlides2.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
      motorSlides2.setMode(DcMotor.RunMode.RUN_WITHOUT_ENCODER);
      encoderLeft = motorSlides2; // encoderLeft shares a port with motorSlides2
      (2) If you don't use all 8 motors on your robot, you can connect the odometry encoder to an open motor port. In this case we use this setup:
      encoderRight = hardwareMap.dcMotor.get("encoderRight");
      encoderRight.setDirection(DcMotor.Direction.FORWARD);
      encoderRight.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
      encoderRight.setMode(DcMotor.RunMode.RUN_WITHOUT_ENCODER);
      Hope this helps. Good luck this season!

  • @OrphanBots • 1 year ago

    What library or imports are you using? Nothing from FTCLib (not as far as I've found anyway) has that XyhVector stuff, or am I encountering a problem unrelated to the libraries and imports?

    • @calebstanziano6033 • 9 months ago

      I'm having that problem too. I see that it is a class he had but I have little idea of what is inside it.
      By the way, this video is amazing. I just started learning odometry and you made it so simple. Thanks.

    • @calebstanziano6033 • 7 months ago

      I had to create that class. It wasn't too difficult or anything.

  • @frankjowitt6205 • 2 years ago

    which omni wheels are you guys using?

    • @DrBatanga • 2 years ago +1

      We use the 60mm omni wheels from RobotShop.
      www.robotshop.com/en/60mm-aluminum-omni-wheel.html

  • @jargonian9758 • 1 year ago

    Hey! What is the purpose of telemetry dx, dy and dtheta? Is that stored for user reference? Also, what is LENGTH? Is that the length of the robot?

    • @youfu90 • 10 months ago

      Are you asking about the equation or the code?

  • @hi_beemo1808 • 1 month ago

    I once was doomed by the high speed and low torque of my robot in a competition. Does anyone know if there is a way to control torque mechanically?

    • @DrBatanga • 1 month ago

      Roughly speaking, torque * speed = power. Since FTC motors have a fixed maximum power they can produce, you can choose between low torque + high speed, high torque + low speed, and everything in between, and that's done by selecting the appropriate gearbox for the motor. If you don't have the exact motor + gearbox combination you need, you can further gear down your wheels with an external gear set, pulleys with a timing belt, or sprockets with a (plastic) chain.
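      A quick worked example with made-up numbers (not the spec of any particular FTC motor):
      double motorFreeSpeedRpm = 300;       // speed at the gearbox output shaft
      double motorStallTorqueNm = 2.0;      // torque at the gearbox output shaft
      double externalReduction = 2.0;       // e.g. a 2:1 belt or gear stage down to the wheel
      double wheelFreeSpeedRpm  = motorFreeSpeedRpm / externalReduction;   // 150 rpm: half the speed
      double wheelStallTorqueNm = motorStallTorqueNm * externalReduction;  // about 4 N*m: double the torque
      // (friction in the gears/belt eats a little of that torque gain in practice)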

  • @ernestotu5350 • 1 year ago

    Hello! Please, which battery did you use for this robot?

    • @DrBatanga • 1 year ago

      We were using the "regular" Pitsco batteries. REV has flat ones too. Pitsco has become very expensive; not sure who sells "legal" batteries in the $50s nowadays. Google it.

  • @gabealimov1479 • 1 year ago

    Do you guys have a source code posted anywhere?

    • @DrBatanga • 1 year ago +1

      The short answer is no, but since you asked, I added the main odometry function from our live code to the video description. Hope this is enough to get you started.

  • @crabbydood933 • 1 year ago +2

    Can’t you just have the robot spin in a circle and divide each of the encoder values by 2pi to find the exact distance from the center, instead of measuring by hand?

    • @DrBatanga • 1 year ago +2

      Yes, you can, and we actually did. It's more accurate than measuring by hand. Before that, you also should drive the robot in a straight line and record the distance and encoder ticks. With that, you get the effective wheel diameter, which may be a bit off from the nominal 60mm.
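      A sketch of that calibration in formula form (cm_per_tick and LENGTH as in the description; measuredDistanceCm, the tick counts and the spin count N are whatever you record during the tests):
      // Straight-line test: push the robot a measured distance and average the two parallel encoders.
      double cm_per_tick = measuredDistanceCm / averageParallelTicks;      // effective wheel size
      // Spin test: rotate the robot in place through N full turns, a total angle of N * 2*pi.
      // Summing dtheta = cm_per_tick * (dn2 - dn1) / LENGTH over the whole spin and solving gives:
      double LENGTH = cm_per_tick * (rightTicks - leftTicks) / (N * 2.0 * Math.PI);
      // and the distance of the aux (perpendicular) wheel from the centre of rotation:
      double auxOffsetCm = cm_per_tick * auxTicks / (N * 2.0 * Math.PI);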

  • @herambsawant50 • 8 months ago

    Hello, I am from team Sigma #20890. I wanted to know if there is any way we can avoid restarting the Control Hub after uploading code?

    • @DrBatanga • 8 months ago

      You should be able to upload new code onto the Control Hub without restarting it, at least most of the time. Occasionally, something gets stuck and requires a power cycle. If you have to restart your Control Hub each and every time you upload code, something's wrong. But that's impossible to troubleshoot through YouTube comments. Contact another team close to where you live and ask them for help on site.

  • @spidernh • 2 years ago +1

    I gotta say... F, sad that you did it this summer and not the summer before or anything

    • @DrBatanga • 2 years ago +2

      LOL. I'm not taking any responsibility for the barriers, but there is hope for the 22/23 season...

    • @spidernh • 2 years ago +1

      @@DrBatanga yeah, I was hoping to do it this year too but then we switched chassis and also the barrier. At least I can still use Roadrunner, which will be an improvement from last year.

  • @Va_aV_ • 9 months ago

    Instead of ticks? Will multiplying wheel revolutions * wheel circumference also give us a similar result for distance travelled by wheel???
    :)

  • @alteru1626 • 1 year ago

    WALTAH

  • @kenusaga • 1 year ago +1

    In your code in the notes what is:
    pos.h = normDiff(pos.h);

    • @DrBatanga • 1 year ago

      As the robot keeps turning, the heading or angle may increase to very high values. It's a good practice to keep them bounded within one rotation, let's say between +/-180 degrees or +/- pi in radians. That's what the normDiff function does.
      Keeping track of the heading as you turn the robot is tricky and needs careful thought. For instance, if the robot points to -150 degrees and you want to turn it by 90 degrees clockwise, you will end up at -240 degrees, which is the same as +120 degrees.
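      As a quick numeric check of that example (headings in radians, using the angle wrapping described above):
      double h = Math.toRadians(-150);   // robot currently points at -150 degrees
      h -= Math.toRadians(90);           // turn 90 degrees clockwise -> -240 degrees
      h = normDiff(h);                   // wrap back into (-180, +180]
      // Math.toDegrees(h) is now +120, the same physical direction as -240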

    • @mohamedazimal3187 • 1 year ago

      @@DrBatanga can we use atan2((y_goal - y_current),(x_goal - x_current))?