How can we generate non-ego vehicles randomly (again and again, in different positions) in the surrounding environment of the ego vehicle during simulation of a scenario?
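One way to do this is with the Automated Driving Toolbox's programmatic API rather than the app. A minimal sketch, assuming a simple straight road; the road layout, lane offsets, speed ranges, and the number of targets are all illustrative, not from the video:

```matlab
% Sketch: spawn non-ego vehicles at random positions each run.
scenario = drivingScenario('SampleTime', 0.01);
road(scenario, [0 0; 200 0]);                     % 200 m straight road (assumed)
egoVehicle = vehicle(scenario, 'ClassID', 1, 'Position', [0 0 0]);
trajectory(egoVehicle, [0 0 0; 200 0 0], 15);     % ego drives at 15 m/s

numTargets = 5;                                   % example count
for k = 1:numTargets
    x = 20 + 160*rand;                            % random start along the road
    y = 3.6*(randi(2) - 1.5);                     % pick one of two lanes (+/-1.8 m)
    target = vehicle(scenario, 'ClassID', 1, 'Position', [x y 0]);
    trajectory(target, [x y 0; x+50 y 0], 10 + 10*rand);  % random speed 10-20 m/s
end

while advance(scenario)                           % step the simulation
    % collect sensor data or call plot(scenario) here
end
```

Calling `rng('shuffle')` before building the scenario (or wrapping the whole thing in a loop) gives a different random placement on every run.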
Is there a reason for that unusual coordinate system? Can it be changed? (Why are the x- and y-axes mixed up, and why does the y-axis run from positive to negative values?) It constantly makes tasks just a little bit tougher, because one has an additional detail to take care of...
Great video! Very helpful and easy to understand.
Thank you for your very clear explanation. The capabilities are really exciting.
Cool!
Thx
Thanks so much for your great work!
I have a question: can I change the ego-centric view to a driver's view?
Can I use the same scenario for self-driving tractors?
How do I use the exported sensor data in code? When I use sensor_data in my code, I get an error saying the variable is undefined.
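The undefined-variable error usually means the exported code was never actually run: the app's "Export > Export MATLAB Function" produces a function, and the variables only exist after you call it. A minimal sketch, assuming the scenario was exported as `myScenario.m` (the name is an example):

```matlab
% Run the exported function to create the data in the workspace.
[allData, scenario, sensors] = myScenario();

% allData is a struct array; each element is one simulation time step, e.g.:
%   allData(1).Time
%   allData(1).ActorPoses
%   allData(1).ObjectDetections
%   allData(1).LaneDetections

% Example: gather all object detections across the whole run.
detections = vertcat(allData.ObjectDetections);
```

If you instead used "Export > Export Sensor Data" to a MAT-file, you need to `load` that file before referencing the variable it contains.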
Kindly share links to different driving scenario examples.
Cool. Can I add an airbag module to this test scenario? If yes, please let me know how.
How did you get your simulation to run so smoothly? Mine lags badly, and I don't know what to change to make it run better.
Hello, can we combine this MATLAB code with an Arduino?
Can we add more modules, like RF modules, for the purpose of V2I communication?
Can we test our autonomous robot code on this?
Is there an option for ultrasonic sensors?
No, there is not.
If there is no prebuilt sensor of that kind in this app, then how can we add our own sensor?
Awesome!