Unity tutorial: AR Indoor Navigation with Vuforia Engine - Workflow, data model & app demo

  • Published 12 Sep 2024

COMMENTS • 135

  • @infiole7159
    @infiole7159 2 years ago +3

    Looks great! I have used your previous demo video to understand more about AR Indoor Navigation when I was first looking into it. I have since created my own application on iOS with ARKit and Microsoft Azure Spatial Anchors.

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      Nice, great to hear you were able to build an app by yourself! How many point clouds do you use in your app? How was it to use Azure Spatial Anchors (in terms of ease of use, availability of documentation, examples, etc.)? I don't have any experience with them at all.

    • @infiole7159
      @infiole7159 2 years ago +1

      @@joshuadrewlow All of the point cloud manipulation is deferred to the SDK; you just provide the anchor transform and it saves the anchor to the cloud once it has enough data. As for re-localisation, you just provide the identifiers from when you created the anchors and point the camera at the general area where you saved the anchor. From there it pulls all the metadata you saved with the anchor and you can repopulate the scene.
      The documentation is pretty detailed and covers a wide range of development platforms (Android, iOS, Unity). It also has quite detailed examples.

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      @@infiole7159 From checking out the docs really quickly, it looks really solid. Cool! I assume the anchors are markerless, so you don't need a reference image; otherwise they would not be called spatial anchors. What are the limits of one anchor, and how do you record it?

    • @Zehra-jr1cb
      @Zehra-jr1cb 4 months ago

      Hey, I am a university student and my group and I are making an AR Indoor Navigation project using Android Studio and Azure Spatial Anchors. We need some guidance on the navigation task. Can you please help us out?

    • @joshuadrewlow
      @joshuadrewlow  4 months ago

      Sorry, no experience at all with Azure anchors 😕

  • @patrickscheper
    @patrickscheper 2 years ago +3

    This is awesome! Great content.

  • @stefanwentink
    @stefanwentink 1 year ago +5

    Hi Joshua, great video! I was wondering: how did you create the start screen/dashboard and the static UI parts in the AR interface? I am pretty new to Unity. My AR part seems to work now, and I would like to create a dashboard with several modes!

    • @joshuadrewlow
      @joshuadrewlow  1 year ago +2

      Hi there, thanks! I used the conventional 2D canvas GameObject: docs.unity3d.com/2020.1/Documentation/Manual/UICanvas.html

  • @adityanjsg99
    @adityanjsg99 4 months ago +2

    I instantly subscribed.

  • @maheshp2048
    @maheshp2048 7 days ago +2

    Hi, can I create the map, the path and the points using a mobile app? Using Unity for everything is an issue, right? I thought about adding an admin part, where the admin can scan the environment with the app and then mark the navigation. Is that possible?

    • @joshuadrewlow
      @joshuadrewlow  7 days ago

      Good question! Wanted to make a project about this a while ago. Yes it's possible to do it by using the Area Target Capture API, check it out here: developer.vuforia.com/library/develop-area-targets/capture-api

  • @AghayevA
    @AghayevA 2 years ago +4

    Great job! We have been doing something similar for the past year but ran into the problem of storing multiple datasets on the device. Have you found a solution for limiting the app size and storing the datasets on a CDN of some sort?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago +2

      That sounds awesome! The two biggest issues for app size are the Area Target mesh used for occlusion and the point cloud database.
      For the mesh we found two ways:
      a) reduce the quality of the mesh (it just looks crappy in the editor) by reducing the resolution of the textures or
      b) you model all occlusion objects low poly by yourself and attach an invisible material that obscures what is behind (this is also a great method to improve imperfect scans btw.).
      The point cloud databases (in the Vuforia docs they are referred to as a DataSet) you can load as Asset Bundles at runtime and activate when you need them: docs.unity3d.com/Manual/LoadingResourcesatRuntime.html
      Here are instructions on activating & deactivating DataSets: library.vuforia.com/features/environments/area-targets/area-targets-native-workflow.html
      We haven't implemented loading as Asset Bundles because of limited project time, but I'm very confident it will work, because a DataSet is a "normal" file that can be picked up in a script. Does that make sense?
      Please let me know when you manage to load them into your app! I'm also planning to extend our app with this function sooner or later :)
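The Asset Bundle approach described above could look roughly like this in Unity C# (a minimal sketch only; the URL, bundle and prefab names are made-up placeholders, and packing a Vuforia DataSet into a bundle may need extra steps not shown here):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch: download an AssetBundle at runtime and instantiate the
// Area Target prefab it contains, so the dataset is not in the build.
public class AreaTargetLoader : MonoBehaviour
{
    // Placeholder URL and asset name, not from the video.
    public string bundleUrl = "https://example.com/bundles/floor1";
    public string prefabName = "Floor1AreaTarget";

    IEnumerator Start()
    {
        using (UnityWebRequest request = UnityWebRequestAssetBundle.GetAssetBundle(bundleUrl))
        {
            yield return request.SendWebRequest();
            if (request.result != UnityWebRequest.Result.Success)
            {
                Debug.LogError(request.error);
                yield break;
            }

            AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(request);
            GameObject prefab = bundle.LoadAsset<GameObject>(prefabName);
            Instantiate(prefab); // the Area Target behaviour activates itself
        }
    }
}
```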

  • @ekeshkumar-rc4hb
    @ekeshkumar-rc4hb 1 year ago +1

    Hi sir, thank you so much for all of your help.

  • @afiqahbasir7638
    @afiqahbasir7638 2 years ago +2

    Awesome! This video is really great and helped me find ideas for my research. Can I ask you, is it possible to use a Microsoft Kinect for Xbox 360 to scan the 3D environment? And may I ask what you are using for the hosting?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      Hello Afiqah, thanks for stopping by! Unfortunately Microsoft Kinect is not listed as a supported camera here: library.vuforia.com/features/environments/area-targets.html
      About your 2nd question: what do you mean by hosting? Do you mean the "hosting" of the point clouds?
      We used two methods: 1) create the point cloud with a Matterport Pro2 camera, upload it to Matterport, buy the MatterPak, generate a Vuforia Area Target with the Vuforia Area Target Generator desktop app (the point cloud is downloaded only once from Matterport, so we cannot really talk of hosting) and import it into Unity. 2) create the point cloud (already in the form of a Vuforia Area Target) with the iOS Vuforia Area Target Creator app and import it into Unity.
      With the new Vuforia Engine update 10.3.2 it is now possible to call an API to create a Vuforia Area Target inside your own app (iOS only for the moment), which is pretty cool!
      Does that help you out? Cheers ✌🏻

    • @afiqahbasir7638
      @afiqahbasir7638 2 years ago +1

      @@joshuadrewlow Thank you Joshua for your response! Is it possible to create indoor AR navigation app without creating Vuforia Area Target?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago +1

      @@afiqahbasir7638 Depends on what you want to do exactly. You could use smaller point clouds like Model or Image Targets and only show an arrow in the direction the user should go. But as soon as you walk 2-5 meters away, your device would not have enough data to show AR content correctly. "Vuforia Area Target" is a brand name of the Vuforia Engine; technically it is an optimised point cloud with real-world scale (you can imagine it like a digital twin of the environment). The benefit of Area Targets is that they can be very large. One of our Area Targets is around 650 m². Does that make sense and answer your question? Best regards, Joshua

    • @afiqahbasir7638
      @afiqahbasir7638 2 years ago +1

      @@joshuadrewlow Thanks Joshua!

  • @allensun6329
    @allensun6329 1 year ago +1

    Nice video! I am wondering about the purpose of converting the point cloud to a mesh. Could you operate directly on the point cloud data?

    • @joshuadrewlow
      @joshuadrewlow  1 year ago

      The mesh is additional to the point cloud. The point cloud is used for localisation and the mesh is used for occlusion. But it can also be deactivated; this would make sense if there are a lot of Area Targets on a device and you want to optimise app size. No, you can't operate directly on a point cloud. Maybe you can with the point cloud data file, but I've never tried it. What operation do you have in mind?

    • @allensun6329
      @allensun6329 1 year ago +1

      @@joshuadrewlow I am thinking of directly putting anchors (area targets) in a virtualized point cloud map, so that you don't have to waste computing power converting the point cloud to a mesh. I know UE5 lets you operate directly on a point cloud; I'm not sure if Unity can do it.

    • @allensun6329
      @allensun6329 1 year ago +1

      @@joshuadrewlow I think Unity can handle virtualized point clouds (there are lots of tutorials on the internet). Btw, I really like your video, it taught me a lot. Have you ever tried to use ARKit/ARCore to do it?

    • @joshuadrewlow
      @joshuadrewlow  1 year ago

      What computing power do you mean? The mesh is generated when the Area Target is generated. I've never heard of virtualized point clouds. Do you have a good example at hand? I did take a look at ARCore and ARKit at the beginning of the project, but Vuforia had fewer limits and much easier point cloud creation. The documentation for Vuforia is also much better. It's just really expensive.

  • @antonlindholm7380
    @antonlindholm7380 2 years ago +3

    Hi!
    I am working on a similar project and I have a couple of questions. Right now I calculate the path with a NavMeshAgent that is set on a 3D cylinder; how do you change it so it calculates from the actual AR camera instead?
    How do you implement the direction arrow? Right now I'm using a LineRenderer, but it only gives me a line that goes in the direction of the point.

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      Nice! In what environment are you doing your project?
      Navigation line from the AR camera:
      Set the NavMeshAgent as a child of the AR camera. I also implemented a simple script so it always stays at the bottom of the camera and on the NavMesh.
      Direction arrow:
      To put it simply, place the arrow above the next corner further away than 2 meters and let it "look" at the corner after that. It doesn't work all too smoothly because the corners change a lot, but we didn't have enough time to optimise it.
      I just recorded another video today where both of these are covered. I just need some time now to cut it and do all the publishing stuff. But I hope the explanation above already helps you a bit. Cheers 🤙
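The two tricks described above (agent pinned under the AR camera, arrow looking at the upcoming corner) might be sketched like this; all field and class names are my own, not from the video, and a real app may need agent.updatePosition = false or agent.Warp instead of setting the transform directly:

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch: keeps a NavMeshAgent on the NavMesh below the AR camera and
// places a direction arrow above the first path corner further than 2 m.
// Assumes a destination was set elsewhere via agent.SetDestination(...).
public class DirectionArrow : MonoBehaviour
{
    public Camera arCamera;    // the Vuforia AR camera
    public NavMeshAgent agent; // child of the AR camera
    public Transform arrow;    // arrow model shown above the path

    void Update()
    {
        // Snap the agent onto the NavMesh under the camera position.
        if (NavMesh.SamplePosition(arCamera.transform.position,
                out NavMeshHit hit, 3f, NavMesh.AllAreas))
        {
            agent.transform.position = hit.position;
        }

        // Put the arrow above the first corner > 2 m away and
        // let it "look" towards the corner after it.
        Vector3[] corners = agent.path.corners;
        for (int i = 0; i < corners.Length - 1; i++)
        {
            if (Vector3.Distance(arCamera.transform.position, corners[i]) > 2f)
            {
                arrow.position = corners[i] + Vector3.up * 0.3f;
                arrow.LookAt(corners[i + 1]);
                break;
            }
        }
    }
}
```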

    • @antonlindholm7380
      @antonlindholm7380 2 years ago +1

      @@joshuadrewlow
      I am doing my project in Unity :)
      Okay, thanks for the answer. I haven't figured out how to get it right with creating the NavMeshAgent as a child of the AR camera, but I will continue today!
      When are you planning on uploading the next video? 😊

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      @@antonlindholm7380 Sorry, with environment I meant: At which location? School, home, office, etc. I plan to upload it still this week.

    • @antonlindholm7380
      @antonlindholm7380 2 years ago +1

      @@joshuadrewlow oh okay haha, it's supposed to be at a school so new students can easily find their way to the different classrooms

    • @joshuadrewlow
      @joshuadrewlow  2 years ago +1

      @@antonlindholm7380 Hello there. Sorry I will not be able to upload the video today. I need to redo the whole thing because during cutting I realized I was all over the place and missing some useful information. Cheers and have a good weekend :)

  • @clifflin7149
    @clifflin7149 2 years ago +1

    Thank you for the great video! A few questions about it:
    - For the Matterport approach, is it the case that only the point cloud is the input for the Vuforia Area Target creator? If I understood correctly, the mesh is not necessary.
    - The Matterport software also supports 360 cameras (e.g. the Insta One X); is it possible to generate the point cloud for this solution that way (which is cheaper than a Matterport scanner)?
    - What if the map is outdated and the floor needs to be re-scanned? Does the app then need to be rebuilt?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago +2

      Loads of good questions! 😏
      1) Input for the Vuforia Area Target Creator: I think it's not only the point cloud but also the images (the images are used for the texture on the mesh). I don't think it works with raw point cloud data only, because what they want is the MatterPak. A good description of the MatterPak can be found here: www.thefuture3d.com/matterpak
      2) Unfortunately 360 cameras are not supported, according to this Matterport page (click on "compare cameras"): matterport.com/cameras/360-cameras I forgot to mention this in the video: in the meantime Matterport supports more cameras, but not all 3D capturing cameras collect enough data, or good enough data, for point cloud creation. What you eventually need is a Matterport scan for which you can buy the MatterPak package. There is also a list of supported cameras in the Vuforia docs, including from companies other than Matterport, but they are all expensive (probably because the tech and algorithms needed for point clouds are quite complex): library.vuforia.com/features/environments/area-targets.html
      3) Outdated maps: it depends on how you build the app. In a simple app, yes, you would have to reimport the new scan as an Area Target, update the points of interest and rebuild the app. But you can also make it more complex: in a client-server architecture you can load Area Targets as Unity assets from a server in the app, for example at start-up. The NavMesh used for the navigation can also be generated at runtime, after you have downloaded the new Area Target. The points of interest you could update, for example, via an admin panel connected to a database, which you could load on app start. Does that make sense to you?
      Cheers and happy coding 🤓🤙

    • @clifflin7149
      @clifflin7149 2 years ago +1

      @@joshuadrewlow Thanks a lot for your explanation in such detail! One more short question: with the MatterPak, is it possible to somehow download and import the generated 3D model into Unity (without Vuforia)?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      @@clifflin7149 You're welcome :) I'm not sure if this is possible; I haven't tried it. Parts of it, yes, for example the floor or ceiling plan, but I don't know about the 3D model. I strongly suspect that the 3D model is created by the Vuforia Area Target Generator. Even if you could download it, you wouldn't be able to use the data with the Vuforia Engine, because they have their own database and file formats. What do you need the 3D model for?

    • @clifflin7149
      @clifflin7149 2 years ago +1

      @@joshuadrewlow If it is possible to download the 3D model generated by Matterport, we could build AR/VR apps out of it, for example for people to view the space remotely in VR. I think the 3D model should have nothing to do with Vuforia, as it's from Matterport.

    • @joshuadrewlow
      @joshuadrewlow  2 years ago +1

      @@clifflin7149 I checked out the first link I sent you and it looks like it is possible! What you get is an OBJ file, which is a mesh with a texture: "An OBJ file lets 3D developers and VR enthusiasts to kickstart their projects with a 3D model of a real-world place." From a quick Google search it looks like you should be able to import .obj into Unity.

  • @imEarth999
    @imEarth999 2 years ago +1

    Do you have any tutorial on how to make a navigation demo app where you choose the destination, like the clip you just showed?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      Well, yes, this is part one of this tutorial 😂🤷🏼‍♂️ Here is the second part if you are looking for how to set up the navigation: ua-cam.com/video/5OjcpyV-N0Q/v-deo.html
      Or are you looking for a tutorial on how to make a scrollable list to select the destination?

    • @imEarth999
      @imEarth999 2 years ago +1

      @@joshuadrewlow Yes, I'm looking for how to make a scrollable list to select the destination.

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      @@imEarth999 There are numerous tutorials that cover dynamic scroll lists, for example this one: ua-cam.com/video/hlNaNtApIMk/v-deo.html
      My tutorials only cover the localization and navigation parts. Unfortunately I can't find the tutorial I used back then, but this one is similar, with a bit more complexity.

  • @dandi_saputra
    @dandi_saputra 9 months ago +1

    I have some questions for my assignment project at college.
    Does scanning the environment have to be done on iOS? Can it be done on Android? And do you have to scan the objects?

    • @joshuadrewlow
      @joshuadrewlow  9 months ago

      Thanks for dropping by! iOS is the cheapest option, using the Vuforia Creator App; it works only on LiDAR-supported devices. Android is not supported. Professional 3D scanners can also be used, like Matterport, NavVis or Leica.

  • @prabalsharma3007
    @prabalsharma3007 1 year ago +1

    Hi Joshua, I have created the model on my iPhone 12 Pro with the 3D live scanner by Laan Labs. Can you please tell me which format I should export it in, and how to import it into the Vuforia Area Target Generator from there?

    • @joshuadrewlow
      @joshuadrewlow  1 year ago

      Can you export it to E57 data? If yes you can use the Area Target Generator desktop app to generate an Area Target: library.vuforia.com/creating-area-targets/area-target-generator-user-guide
      If this doesn't work it's best to use the AT creator app: apps.apple.com/ch/app/vuforia-area-target-creator/id1525517431
      Cheers

  • @ruiguimaraes4346
    @ruiguimaraes4346 1 year ago +1

    Hi Joshua,
    I'm trying to do something like you have done, but I have some problems. I used to use Vuforia Area Targets in Unity, but when I updated my Vuforia version, the errors started.
    I never had this type of error when using Area Targets.
    I did the scan with an iPhone and imported it into Unity, but it gives me this error:
    MissingMethodException: bool UnityEngine.Texture2D.Reinitialize(int,int,UnityEngine.TextureFormat,bool)
    Vuforia.Image.CopyBufferToTexture (UnityEngine.Texture2D texture) (at :0)
    Vuforia.Internal.Utility.GLTFLoading.GLTFTexture.GetTexture () (at :0)
    .
    .
    .
    Has this ever happened to you? Do you know any solution?
    Thanks,

    • @joshuadrewlow
      @joshuadrewlow  1 year ago

      No, I don't remember having an error like this. From which to which version did you update? Do you have the repo somewhere? You could send me a link if you want: josh@firefighter-ar.com

  • @HaadiaJaved-l3l
    @HaadiaJaved-l3l 2 days ago

    Can we navigate a path or rooms in AR using GPS?

    • @joshuadrewlow
      @joshuadrewlow  1 day ago

      GPS can help to find the approximate location, floor, etc., to about 10 m of accuracy. To show a path in AR you additionally need a point cloud to match the exact location of the phone/device camera. Does that make sense?

  • @92sphere
    @92sphere 1 year ago +1

    Hi, is it important to scan and have a pretty realistic model? Can I achieve the same thing if I only have a simple model that I built at a relative scale compared to the real area? I'm currently working on my graduation project and I'm stuck on the part of scanning the area. 😢

    • @joshuadrewlow
      @joshuadrewlow  1 year ago +1

      You need to understand in detail how this stuff works. It is not the model that is used for localisation; the model is only a by-product that helps with development. What is used for localisation is a point cloud. This is a dataset of unique points in the environment, like a digital twin. During the localisation process (this actually happens all the time with the device camera) the phone creates a point cloud of what it's currently seeing and compares it with a previously scanned point cloud to find its precise location. Vuforia requires that you create point clouds (they call them "Area Targets") with certain methods; the cheapest one is the free iOS app. All you need is a device with LiDAR support. More information here: library.vuforia.com/environments/area-targets

    • @92sphere
      @92sphere 1 year ago +1

      @@joshuadrewlow Thanks, I get it now. Previously, I was planning to use GPS coordinates, since I only need objects to appear in a specific location.

  • @sylv3d799
    @sylv3d799 2 years ago +1

    Hi!
    This project is awesome! How can there be automatic localization with point clouds in space? Does Vuforia do it automatically or did you create a script? Is there a starting point for the tour? Thanks in advance 😊

    • @joshuadrewlow
      @joshuadrewlow  2 years ago +2

      Good questions! 🧐 Let's try to answer them in order.
      Automatic localization: the Vuforia Engine can "only" localise automatically inside 2-3 point clouds (Area Targets) at the same time. They all have to be activated, and performance depends on the amount of detail. For more than 2-3 point clouds you need to handle the activation of Area Targets yourself. You have different possibilities: GPS, QR code scanning, asking the user about their approximate position in the UI, or looping through the available Area Targets. We did the last two, but the last was not so successful. Once localised, you can enable other Area Targets via, for example, Collider Triggers.
      Starting point for the tour: do you mean the guided tour feature or navigation in general? Both can be started from anywhere, because we ask the user which floor they are on and then leave all Area Targets for that floor activated. We only had 3 Area Targets because of lack of time, but we handled it so it can be extended. Does that make sense to you? Cheers 🤙
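The Collider Trigger idea mentioned above could be sketched like this (plain GameObject activation shown for simplicity; all object references and the tag check are assumptions, not from the video):

```csharp
using UnityEngine;

// Sketch: a trigger volume placed near a floor transition that activates
// the next Area Target and deactivates the current one when the AR
// camera (tagged "MainCamera") walks through it.
public class AreaTargetSwitch : MonoBehaviour
{
    public GameObject currentAreaTarget;
    public GameObject nextAreaTarget;

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("MainCamera")) return;
        nextAreaTarget.SetActive(true);    // start localising in the next area
        currentAreaTarget.SetActive(false);
    }
}
```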

    • @joshuadrewlow
      @joshuadrewlow  2 years ago +2

      Maybe you can also check out the other video where I explain more about localisation: ua-cam.com/video/5OjcpyV-N0Q/v-deo.html
      I also planned to make another video specifically about multiple point clouds.

    • @sylv3d799
      @sylv3d799 2 years ago +1

      @@joshuadrewlow Thank you very much for your answer, that's exactly what I wanted to know.
      I have encountered an issue with the LineRenderer:
      it works well in Unity but disappears after the build.
      It seems to work when Occlusion Mesh and Mesh Collider are disabled (in the Area Target Behaviour script), but then the line is not in the right position.
      I can't solve it after trying many things... 😔
      Have you ever encountered this problem?

    • @sylv3d799
      @sylv3d799 2 years ago +1

      Here is a short video that shows this issue
      ua-cam.com/video/QQ8oMsNzd30/v-deo.html

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      @@sylv3d799 Now the comments are visible... Really weird, YouTube. But I heard from other YouTubers that they have similar problems. Imagine the millions of comments on YouTube every day... It is a miracle they all stay up anyway 😆 Did you see my comments on your video?

  • @junyang1710
    @junyang1710 2 years ago +1

    Hi, which Vuforia plan do you use (free or Premium)? The free plan is limited to 20 targets, so it's not enough for a large area.

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      Hi Jun, I'm using the free (basic) plan. If you use the Matterport Pro2 camera you can cover around 400 square meters with each Area Target. This would give you a total size of 8000 square meters, which is not too bad. Although when it comes to publishing you will definitely need a premium plan. I contacted Vuforia for the premium price and it is 25'000 USD per year, fixed price, no customisation. I know this is a ridiculously high number (that's why they don't put it on the website), but bear in mind that you only need it when you publish, so you probably have a client, a budget, etc. Long story short: the free plan is for prototyping only; after that you need loads of money for a production app.

    • @junyang1710
      @junyang1710 2 years ago +1

      @@joshuadrewlow thanks a lot

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      You are very welcome 🙂

  • @TGFanatic
    @TGFanatic 2 years ago +1

    Hey, I wanted to know how you set the location of the destination using the Vuforia Area Target Creator App. Do you create your own DB with all the points? Like, how would the app know which way to make the turns, or is it done automatically because of the NavMesh?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago +1

      Hi there, I don't quite understand your question, but I will try to respond anyway: the Area Target Creator app can only be used to create Area Targets. They can then be transferred manually from the iPad to the computer and added to an app (in our case a Unity app) for localisation. The Area Target has its own database of points, which can be activated and deactivated. The line to a destination point is drawn from a NavMeshAgent (located under the AR camera) to a destination inside an Area Target. Does that make sense & answer your question?
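Drawing that line could be sketched roughly like this (field and class names are mine, not from the video; assumes a LineRenderer is configured in the scene):

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch: recalculate a NavMesh path from the agent (under the AR camera)
// to the selected destination every frame and feed it to a LineRenderer.
public class NavigationLine : MonoBehaviour
{
    public NavMeshAgent agent;     // child of the AR camera
    public Transform destination;  // point of interest inside the Area Target
    public LineRenderer line;

    void Update()
    {
        NavMeshPath path = new NavMeshPath();
        if (NavMesh.CalculatePath(agent.transform.position,
                destination.position, NavMesh.AllAreas, path))
        {
            line.positionCount = path.corners.Length;
            line.SetPositions(path.corners);
        }
    }
}
```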

    • @TGFanatic
      @TGFanatic 2 years ago +1

      @@joshuadrewlow Perfect, exactly what I needed. Thank you so much, brother.

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      @@TGFanatic Thanks for the feedback 😁👍 To be more precise: of course there can be multiple Area Targets. Handling the navigation and the augmented objects just gets a bit trickier. Cheers

  • @SumitKumar-fn3gj
    @SumitKumar-fn3gj 2 years ago +1

    Thanks for this. You are a legend.

  • @Arnab_rider
    @Arnab_rider 2 years ago +1

    Hi Joshua, this is an amazing illustration.
    I am having a blocker: I have some Area Targets which I have already imported into Unity, and the NavMesh is also baked. The Area Target is not of my place, and I want to test the application inside Unity only. I am using Vuforia 10.5, which is showing a video plane. How can I simulate the whole experience inside the Unity editor, with camera movement via the arrow keys?

    • @Arnab_rider
      @Arnab_rider 2 years ago +1

      Got it...

    • @joshuadrewlow
      @joshuadrewlow  2 years ago +1

      Hi Arnab, thanks for your comment. You need to enable Simulator in the Vuforia Behaviour script properties which is attached to the AR camera object. And then you should be able to start the simulation with the play button above the Unity scene editor window. Did that work?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      @@Arnab_rider dang, I was too late 😜 Here is the documentation with more details: library.vuforia.com/unity-extension/vuforia-play-mode-unity. You may also want to check out the Recording play mode, which enables debugging in Unity with a real AR session recorded on site. Cheers 🤙

    • @Arnab_rider
      @Arnab_rider 2 years ago +1

      @@joshuadrewlow yeah, previously I forgot to mark the Area Target as navigation static. It is working fine now. Thanks for the quick reply!

    • @joshuadrewlow
      @joshuadrewlow  2 years ago +1

      @@Arnab_rider Depending on your world center mode, a static object will not always work, because the Area Target is placed in the scene relative to your AR session starting point. It works in the editor because you don't have a real AR session there. The NavMesh should be baked as a NavMeshSurface and should be a child of the Area Target, so it always corresponds with the real environment. Hope that makes sense...
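Baking the NavMesh on a surface under the Area Target, and rebuilding it at runtime (for example after loading a new Area Target), could look like this; it assumes Unity's AI Navigation package, which provides the NavMeshSurface component:

```csharp
using UnityEngine;
using Unity.AI.Navigation; // NavMeshSurface, from the AI Navigation package

// Sketch: rebuild the NavMesh at runtime on a surface that is a child of
// the Area Target, so the walkable area moves with the target when it
// localises.
public class RuntimeNavMeshBake : MonoBehaviour
{
    public NavMeshSurface surface; // component on a child of the Area Target

    void Start()
    {
        surface.BuildNavMesh(); // bake once the Area Target is in the scene
    }
}
```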

  • @Sudheer.MS_dancer
    @Sudheer.MS_dancer 2 years ago +1

    Is there any alternative Android app for scanning and uploading the scene to Unity?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      According to the Vuforia docs, no. With the new update the Vuforia Engine has an API to create point clouds in your own app. I haven't looked at the feature yet, but it is probably available for Android and iOS.

  • @user-gc1hg9sp9k
    @user-gc1hg9sp9k 1 year ago +1

    Do we need an iPhone to scan the object?

    • @joshuadrewlow
      @joshuadrewlow  1 year ago

      What object? This use case is based on scanning an environment. And yes, an iPhone with LiDAR support can be used.

  • @hasyafadzil3278
    @hasyafadzil3278 2 years ago +1

    Hi Joshua! How do I contact you personally? I'm currently doing a project where students can navigate to bookshelf locations in the library.

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      Are you on LinkedIn? If yes, then please contact me there. Otherwise I can post my email address here.

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      You can also ask questions here, so everyone can profit from it :)

    • @hasyafadzil3278
      @hasyafadzil3278 2 years ago

      @@joshuadrewlow how do you connect your app to navigation? Using visual studio?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      What do you mean by "connect your app to navigation"? Navigation is a feature of the app. And yes, I use Visual Studio to write the C# scripts that are used in Unity. Do you want to debug with Visual Studio?

  • @TGFanatic
    @TGFanatic 2 years ago +1

    Hey, I am also planning to use Matterport and Vuforia Area Targets for our university's AR navigation app, and I wanted to know: after generating the Matterport scan (which I captured with my iPhone), how can I import it into Unity? I want to know the steps for importing my Matterport scan into Unity.

    • @joshuadrewlow
      @joshuadrewlow  2 years ago +1

      Hi there, that sounds amazing! Check out the Vuforia docs, here it is explained perfectly: library.vuforia.com/develop-area-targets/area-targets-unity

    • @TGFanatic
      @TGFanatic 2 years ago +1

      @@joshuadrewlow Thanks a lot. However, I am seeing that I don't have the option to use the MatterPak, maybe because I used my iPhone to scan and not a 3D camera. I am still trying to understand how I can go about setting up my environment, e.g. is there a way to do it without using Matterport scans, but rather with a 3D model of the floor? Would be a great help to know. Thank you.

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      @@TGFanatic Did you find a solution yet? You can only enable the MatterPak when you have scanned the area with a LiDAR-supported iOS device. This information is written somewhere in the Vuforia docs... Sorry if I didn't say that earlier. Best regards, Joshua

  • @dabinkim16
    @dabinkim16 2 years ago +1

    Hi :) I've tried to scan my room with my iPhone 6s, but why is it not possible to click the + button?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      Hi D K, unfortunately it only works on devices with LiDAR. For iPhones this means the iPhone 12 Pro or iPhone 12 Pro Max and later. LiDAR is also included on the iPad Pro (2020 or later). Are you using the Area Target Creator or the Matterport app?

    • @dabinkim16
      @dabinkim16 2 years ago +1

      @@joshuadrewlow Thanks for your answer. I'm using the Vuforia Area Target Creator… Is there any other way to scan rooms with the iPhone 6s?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      @@dabinkim16 Maybe for other kinds of 3D content creation, but unfortunately not for creating Area Targets. ☹️

    • @dabinkim16
      @dabinkim16 2 years ago +1

      @@joshuadrewlow Is it free for a student to use the Vuforia Area Target Creator App with Unity3D?

    • @joshuadrewlow
      @joshuadrewlow  2 years ago

      @@dabinkim16 Basically it's free, yes. With a Vuforia developer license you can create 10 Area Targets, but you cannot publish the app to a public app store (for a beta store/TestFlight it works, though). Make sure you ask at your school/university if they have a Vuforia Education license (only 3000.- per year for a specific amount of licences); there you have no limits. For Unity: the basic Unity is free up to a company revenue of 200'000 per year. If you apply for the student programme you can even get Unity Pro.

  • @louanne5478
    @louanne5478 2 years ago +1

    Hello! Amazing project! We are trying to achieve something similar but are having trouble setting an external position for our Area Target. Would you happen to know how to implement a SetExternalPosition method on an Area Target? I have got everything working, but it still won't pick up the vector I used for the external position.

    • @joshuadrewlow
      @joshuadrewlow  2 роки тому +1

      Hello there, unfortunately not out of the box. I would have to try it myself. We didn't have time to work on the external position, but I will in the future. Are you using the latest Vuforia Engine version 10.3?

    • @louanne5478
      @louanne5478 2 роки тому +1

      @@joshuadrewlow We are using Vuforia 10.3.2. I will let you know if we figure it out, thank you!

    • @joshuadrewlow
      @joshuadrewlow  Рік тому

      Did you manage to implement Location Prior? And did you manage to also set the altitude?
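
      (For anyone else landing on this thread: below is a rough Unity sketch of how a GPS-based location prior could be fed to an Area Target. The SetExternalPosition call and its Vector2 parameter are assumptions based on the method name discussed above, not a confirmed API — verify the exact signature in the Vuforia Engine API reference for your version.)

      ```csharp
      using UnityEngine;
      using Vuforia; // Vuforia Engine Unity package

      // Hypothetical sketch: feeding a GPS-based location prior to an Area Target.
      // NOTE: SetExternalPosition and its exact signature are assumptions taken
      // from this thread and may differ in your Vuforia Engine version.
      public class LocationPrior : MonoBehaviour
      {
          [SerializeField] AreaTargetBehaviour areaTarget;

          void Start()
          {
              // Start GPS updates (requires location permission on the device).
              Input.location.Start();
          }

          void Update()
          {
              if (Input.location.status != LocationServiceStatus.Running) return;

              var gps = Input.location.lastData;
              // Assumed call: pass latitude/longitude as the external position prior.
              areaTarget.SetExternalPosition(new Vector2(gps.latitude, gps.longitude));
          }
      }
      ```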

  • @syahmctips2.071
    @syahmctips2.071 6 місяців тому

    Can I use a normal phone if I don't have the 3D camera? It seems expensive.

    • @joshuadrewlow
      @joshuadrewlow  6 місяців тому

      It is expensive. I only recommend it for real apps, not for creating prototypes or testing things. Here you find more information; it requires an iPhone with LiDAR: developer.vuforia.com/library/environments/area-targets#prerequisites

  • @kathirarul4112
    @kathirarul4112 25 днів тому

    Bro, I don't have an iPad. Is it possible with a laptop and LiDAR?

    • @joshuadrewlow
      @joshuadrewlow  25 днів тому

      Hi, thanks for dropping by. No unfortunately it requires an iPad or iPhone with LiDAR support to scan the environment.

  • @yoto6730
    @yoto6730 2 роки тому +1

    can you make unity+ Vuforia tutorial for beginners please

    • @joshuadrewlow
      @joshuadrewlow  2 роки тому

      I'm actually planning to create a paid tutorial for beginners and a template in the Unity Asset Store next year. A tutorial for beginners is a lot of work because there are so many steps and things to consider. There are loads of tutorials on UA-cam that explain loads of stuff (with only two years of experience I'm also not a Unity expert after all). Can I help you with something in particular to get started? Cheers 🤓🤙

  • @gopalkrishna5194
    @gopalkrishna5194 2 роки тому +1

    Hi sir,
    I am currently doing indoor navigation using AR for my college. I don't know how to start this project. Can you please explain it?

    • @joshuadrewlow
      @joshuadrewlow  2 роки тому

      Hi there, do you have a specific question or concern? I suggest you watch the two tutorial videos I've already created explaining this project. Check out the links in the descriptions to get started. In the beginning you probably need to watch and read loads of tutorials. That's how I started as well. Best regards, drewjosh

    • @srikripaas9997
      @srikripaas9997 2 роки тому +1

      Hey, have you been able to do it? I have to do the project as well. Can you tell me how you started?

    • @joshuadrewlow
      @joshuadrewlow  2 роки тому +1

      @@srikripaas9997 Well yes, check out this playlist with the tutorials (Although it is done with Vuforia 9.7, with Vuforia 10 things are a bit different): ua-cam.com/play/PLcYgptwJHxUE7Q9ymWEX0zDs3SDgMarxu.html
      Did you manage to get started in the meantime?

    • @srikripaas9997
      @srikripaas9997 2 роки тому +1

      @@joshuadrewlow Thank you SO much for replying!... Yes, I did get started with it, but with a slightly different method!.. I'm actually still figuring out how to get started!... 😛😛

    • @abhijithkb10
      @abhijithkb10 Рік тому +1

      Have you done it? If so, can you guide me as well?

  • @chiranthr3302
    @chiranthr3302 Рік тому

    Hello Sir
    Can I use a ZED Mini for 3D scanning and feed the result into the Vuforia Area Target Generator?

    • @joshuadrewlow
      @joshuadrewlow  Рік тому

      Hello Sir, thank you for your comment. It looks like this is not possible out of the box. I found some documentation on how to extract a 3D point cloud from a ZED Mini device (www.stereolabs.com/docs/depth-sensing/using-depth/#getting-point-cloud-data). Nevertheless, you would then still need to convert it into an .e57 file, and I have no expertise in that. I also don't have the time or money to buy one specifically to try this out. Have you tried scanning a room with an iPhone using the Vuforia Capture app?

  • @mohammadismail9000
    @mohammadismail9000 6 місяців тому

    Hello, how can I integrate a 3D map generated from Matterport with a NavMesh?

    • @joshuadrewlow
      @joshuadrewlow  6 місяців тому

      What is the format of the 3D map? Do you already have an Area Target? For the NavMesh you can use any 3D model. For the Area Target you need the MatterPak.
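
      (Side note for anyone combining the two: once the Matterport mesh from the MatterPak is imported into Unity and a NavMesh is baked on it via the Navigation window, a path can be computed and drawn with a minimal script like the one below. The object references are placeholders you wire up in the Inspector; this is a sketch, not the exact setup from the video.)

      ```csharp
      using UnityEngine;
      using UnityEngine.AI;

      // Minimal sketch: compute a path on the baked NavMesh between the user
      // (e.g. the AR camera) and a destination, and draw it with a LineRenderer.
      public class PathDrawer : MonoBehaviour
      {
          [SerializeField] Transform user;    // e.g. the AR camera transform
          [SerializeField] Transform target;  // the navigation destination
          [SerializeField] LineRenderer line;

          void Update()
          {
              var path = new NavMeshPath();
              if (NavMesh.CalculatePath(user.position, target.position,
                                        NavMesh.AllAreas, path))
              {
                  // Feed the path corners to the LineRenderer to visualize it.
                  line.positionCount = path.corners.Length;
                  line.SetPositions(path.corners);
              }
          }
      }
      ```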

  • @NighTMare.001
    @NighTMare.001 3 місяці тому

    Hey, I am seriously thinking about making this (or something similar) as my final year computer science project. Can you please answer some simple questions for me?

    • @joshuadrewlow
      @joshuadrewlow  7 днів тому

      So sorry, I've just seen that I haven't replied to your comment yet. Is your project already over or did you find answers somewhere else? Cheers

  • @pratishthasingh6102
    @pratishthasingh6102 Рік тому

    hey there!! I'm facing an issue importing the model from Matterport, it's asking for a subscription. Can you please tell me the alternative, because I'm a student and can't afford this amount.

    • @joshuadrewlow
      @joshuadrewlow  Рік тому

      Sure! All you need is a LiDAR-supported iPhone or iPad and you can create Area Targets without using Matterport services (there are a few limits, but I recommend Matterport for a production app). If you don't have one, I'm sure someone at school has one, or maybe the school can acquire one for the project. Often schools have a budget that can be used for projects like this; sometimes you just have to ask! After creation you can copy the Area Targets from the device and import them into Unity. Cheers, and let me know if I can help otherwise 😊

    • @pratishthasingh6102
      @pratishthasingh6102 Рік тому +1

      @@joshuadrewlow thank you so much!! 🥺You always help us out.