Announcing the NavAbility Tutorials

We’re excited to announce the NavAbility tutorials, which demonstrate how real-world robot navigation challenges (like false loop-closures and distributing computations) are addressed using the NavAbility platform.

The tutorials are available as online Binder notebooks, short videos, and a GitHub repository that you can pull down if you want to run the examples locally. For the moment they are available in Python and Julia but we expect to release the Javascript versions soon!

The tutorials are designed to be:

    • Zero-footprint examples that take you from problem definition to results and analysis in about 15 minutes
    • Easily run in the browser as JupyterHub notebooks, though you are free to pull the code to your local machine
    • Quickly digested with our accompanying YouTube videos
    • Open to everyone without logins – all tutorials will always be available and can be run at any time using our guest account

Note: The cloud solving is done with a community-level version of our solver, so the tutorials may take a few moments to run. Reach out to us by email or on Slack if you’d prefer to use a high-performance solver.

We’ll continue to grow the library of tutorials in the coming months. If you have any questions or suggestions please reach out to us at info@navability.io. Subscribe to our newsletter to keep up to date with new tutorials!

We’re updating the NavAbility Cloud!

In preparation for our tutorials at ICRA 2022 (join us on May 27th, in person or remotely on Gather.Town!), NavAbility is excited to announce the deployment of the first official production environment of the NavAbility Cloud. Our mission is to enable more vibrant robotics automation, and this is an important step in empowering our community with a reliable, stable, and secure NavAbility Cloud.

These changes will enable exciting upcoming features such as automated calibration, big data integration, and simplified user authentication. More to follow as we release these features!

Timeframe and Affected Services

The deployment will begin at 8:00pm PST on April 14, 2022. You may experience some instability of existing environments during the deployment as we cut various services over into production.

Once the deployment is complete, we will announce it in the ‘general_help’ channel on the NavAbility Slack. After that announcement, you should switch to using the NavAbility Cloud via the following URLs:
https://app.navability.io
https://api.navability.io

As a consequence of this deployment, we will be asking our existing users to recreate their user accounts. Please go to https://app.navability.io, click login and then sign up. If you need your existing data migrated to your new account, please email us at support@navability.io by May 13, 2022.

Thank you for your patience and understanding while we take this milestone step in our journey.

New YouTube video: Easily using camera data for navigation and localization

You have a camera on your robot… but now what?

We’re starting the discussion on how to convert camera data into information that can be used for navigation and localization.

Next up: Loop closures and why that’s the crux of a robust navigation solution!

References:
* Edwin Olson’s AprilTags paper: https://april.eecs.umich.edu/media/pdfs/olson2011tags.pdf
* C/C++ library: https://github.com/AprilRobotics/apriltag
* A Python library: https://github.com/duckietown/lib-dt-apriltags
* Julia library: https://github.com/JuliaRobotics/AprilTags.jl
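If you want to try this yourself, below is a minimal sketch using the Julia library listed above (AprilTags.jl), following the usage pattern in its README. The image path is a placeholder for one of your own camera frames, and the field names on the returned detections (`id`, `c`) reflect our reading of the package and may differ slightly between versions.

```julia
using AprilTags       # https://github.com/JuliaRobotics/AprilTags.jl
using Images          # for loading the frame and converting it to grayscale

# "frame.png" is a placeholder path for one of your own camera images.
img = Gray.(load("frame.png"))

# Create the detector (default tag36h11 family) and run it on the frame.
detector = AprilTagDetector()
tags = detector(img)

for tag in tags
    # Each detection carries the decoded tag id and its pixel-space center;
    # these are the data-association and geometric cues that feed into
    # navigation and localization downstream.
    println("tag ", tag.id, " detected near pixel ", tag.c)
end

# Free the underlying C detector when done.
freeDetector!(detector)
```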

We’re all learning here, so please feel free to comment about other good wrappers of the AprilTags library!

NavAbility SDKs Released!

Beta versions of the NavAbility SDKs are now available for Python, Julia, and Javascript!

At NavAbility we’re excited to make robot navigation available to all! From enterprise commercial companies through to high-school hobbyists, our software lets you build out a complete robot navigation system for any application.

Navigation AI – also called a SLAM system, perception, or simply mapping – is the critical technology standing between robots and real-world deployment. Read more here if you’d like to know how we’re solving this challenge.

With this in mind we’re excited to announce our Python, Julia, and Javascript SDKs!

The SDKs can be found on our website and in the NavAbility GitHub repositories.

These SDKs are under active development, and this is their first release. A few notes for users:

    • Expect that these will change as we introduce new features, so we recommend following NavAbility news on LinkedIn or joining the Slack channel to keep up to date.
    • We recommend using the “Guest” user for testing (which is open and accessible to all), but if you would like a user account of your own, feel free to reach out to info@navability.io.
    • We’re actively prioritizing features, so please open issues against the GitHub repositories if you would like specific functionality prioritized.

If you have any questions or feedback, please reach out to us at help@navability.io!

New YouTube video: Factor graphs and their importance in robotics

We promised to have a conversation on all things robotics, and a great place to start that conversation is factor graphs. This is the topic of our second video on the NavAbility YouTube channel, which is embedded below.

We also love communication, so if you have a topic in mind please comment on the videos or email us at info@navability.io.
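As a self-contained illustration of what a factor graph encodes (a toy example only, not the NavAbility solver), here is a tiny 1-D version in plain Julia: three robot positions tied together by a prior, two odometry measurements, and one loop-closure-style measurement, solved jointly by least squares. All numbers are made up purely to show the structure.

```julia
using LinearAlgebra

# Toy 1-D "factor graph": three robot positions x0, x1, x2 stacked in a vector.
# Each measurement (factor) contributes one row relating the variables it touches.
A = [ 1.0  0.0  0.0;   # prior:        x0       ≈ 0.0
     -1.0  1.0  0.0;   # odometry:     x1 - x0  ≈ 1.0
      0.0 -1.0  1.0;   # odometry:     x2 - x1  ≈ 1.0
     -1.0  0.0  1.0]   # loop closure: x2 - x0  ≈ 2.1 (slightly inconsistent on purpose)
b = [0.0, 1.0, 1.0, 2.1]

# Solving all factors jointly spreads the 0.1 of disagreement across the graph
# instead of trusting any single measurement outright.
x = A \ b
println(x)   # ≈ [0.0, 1.03, 2.07]
```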

Application Example: Marine Vehicle Mapping Systems for Collision Avoidance and Planning

We’re all excited for a Jetsons-like world with robotics in every aspect of life, solving a variety of challenging problems. But most robots are failing to leave safe, controlled lab spaces. Why?

The Imperfect Information Problem: Operating in the real-world requires robust solutions that can readily manage the chaos of dynamic environments and imperfect data. In the case of robotics, this is the deciding factor between a commercially-successful robot and a failure to launch.

Why Marine Vehicles: Coordinating marine vehicles is an ideal example of automation in a complex, dynamic environment. How do you ensure that your vehicles can safely track and operate around other ships, make the most of the variety of potentially-disagreeing sensors, and robustly handle the busy environment while completing critical tasks?

The NavAbility Case Study: At NavAbility we’re using data from MIT Sea Grant’s REx/Philos vehicles to demonstrate how any robot can extract map information from multiple sensors, identify and track dynamic objects like ships, and use prior information to navigate effectively in a dynamic environment.

Announcing our YouTube channel and Livestream on all things robotics!

We’re excited to announce our NavAbility YouTube channel on all things robotics!

We’ll dive into interesting topics about robots, sensors, navigation, and coordination – the “what to expect when you’re expecting a robot” for everyone from commercial users through to home hobbyists. Jim will also be doing a YouTube Live stream to discuss the latest video, answer questions, and talk about industry news.

Subscribe to the NavAbility YouTube channel to follow us as we release these discussions. We also love communication, so if you have a topic in mind please comment on the videos or email us at info@navability.io.

Conference News: Join us at ICRA 2022 in Philadelphia!

Great news! We’re giving a tutorial workshop at the premier robotics conference, IEEE’s International Conference on Robotics and Automation (ICRA) in Philadelphia. 

Join us to learn why navigation robustness is critical in building effective robotics. This is a great opportunity to discuss real-world robotics problems such as the notorious “How do I handle uncertainties in my data?” question, i.e. multi-hypothesis and null-hypothesis data. We’ll also address emerging topics such as SLAM cloud computing and data persistence.
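For readers who want a feel for what multi-hypothesis and null-hypothesis data handling can look like in practice, below is a rough sketch using the open-source Caesar.jl / RoME.jl stack (mentioned elsewhere in these posts). The `multihypo` and `nullhypo` keywords are features of that stack; the specific variables, measurements, and covariances here are invented for illustration, and exact call signatures may differ between versions.

```julia
using RoME, Distributions, LinearAlgebra

fg = initfg()

# A single vehicle pose and two candidate landmarks that one measurement
# could plausibly be associated with (all values are illustrative).
addVariable!(fg, :x1, Pose2)
addVariable!(fg, :l1, Point2)
addVariable!(fg, :l2, Point2)

addFactor!(fg, [:x1], PriorPose2(MvNormal(zeros(3), diagm([0.1, 0.1, 0.01]))))
addFactor!(fg, [:l1], PriorPoint2(MvNormal([4.0, 1.0], diagm([1.0, 1.0]))))
addFactor!(fg, [:l2], PriorPoint2(MvNormal([4.0, -1.0], diagm([1.0, 1.0]))))

# Multi-hypothesis: we are unsure whether this bearing-range measurement
# belongs to :l1 or :l2, so the solver is told to consider both (50/50).
addFactor!(fg, [:x1, :l1, :l2],
           Pose2Point2BearingRange(Normal(0.0, 0.05), Normal(4.0, 0.2)),
           multihypo = [1.0, 0.5, 0.5])

# Null-hypothesis: this measurement might simply be wrong (e.g. a false
# loop closure), so give it a 20% chance of being ignored entirely.
addFactor!(fg, [:x1, :l1],
           Pose2Point2BearingRange(Normal(0.3, 0.05), Normal(4.5, 0.2)),
           nullhypo = 0.2)

solveTree!(fg)
```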

Find our Workshop Landing Page here. See you in Philadelphia May 23rd – 27th!

More to follow! Keep up to date by following us on LinkedIn, joining the discussion on Slack, or learning more about our Technology.

Coming soon: Marine surface vehicle example using radar for localization

One of our immediate aims is to demonstrate the NavAbility technology in real-world applications, and in this vein we’d like to give a preview of results from an exciting project.

This example is applicable to a variety of use cases, from docking marine vehicles through to navigating industrial indoor robots… And yes, self-driving too.

In the coming weeks we will release this as a complete working example. In the meantime, here is a short overview and preliminary results.

MIT Sea Grant REx at the MIT pavilion with the Njord submersible hanging from the center

REx and Radar Data

The MIT Sea Grant Remote Explorer (REx), shown above, is an autonomous marine surface vehicle used for diverse research applications. The vehicle is equipped with a variety of sensors; however, the focus of this project is to use only the radar data for localization as the vehicle entered Boston Harbor.

Dehann Fourie (CEO) constructed a GPS-denied navigation system using the NavAbility stack, processing raw ROS radar payloads (shown below) into a complete factor graph to calculate a SLAM solution.

The example demonstrates how specialized data types can be created to capture and process radar information, which is simple to do using the Caesar.jl stack. A preliminary result of the solved trajectory is shown below.

Raw radar sweep extracted from ROS dataset
The solved trajectory (light-blue) of the vehicle as it travels into the harbor (poses highlighted).
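To give a flavour of what “processing payloads into a factor graph” means, here is a highly simplified sketch using the open-source Caesar.jl / RoME.jl stack referenced above. It builds a short pose chain from made-up relative-pose measurements standing in for radar-derived odometry; it does not show the project’s specialized radar data types or the actual REx data, and exact API details may vary by version.

```julia
using RoME, Distributions, LinearAlgebra

# A highly simplified stand-in for the radar example: a short Pose2 chain
# where each relative-pose factor plays the role of a radar-derived
# odometry/alignment measurement (all numbers are invented).
fg = initfg()

addVariable!(fg, :x0, Pose2)
addFactor!(fg, [:x0], PriorPose2(MvNormal(zeros(3), diagm([0.1, 0.1, 0.01]))))

radar_odo = [[10.0, 0.0, 0.10], [9.5, 0.5, 0.05]]   # Δx, Δy, Δθ between sweeps
for (i, dx) in enumerate(radar_odo)
    prev = Symbol("x", i - 1)
    next = Symbol("x", i)
    addVariable!(fg, next, Pose2)
    addFactor!(fg, [prev, next], Pose2Pose2(MvNormal(dx, diagm([0.2, 0.2, 0.05]))))
end

# Solve the factor graph for the full trajectory.
solveTree!(fg)
```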

Complete application example to follow soon

Keep an eye out for the complete example in the coming weeks!

Intrigued? Imagine what we can do with your robotics application! Please reach out to us at info@navability.io if you’re curious or have any questions.