UTIAS / en U of T team takes top spot in self-driving car challenge for 6th time in 7 years /news/u-t-team-takes-top-spot-self-driving-car-challenge-6th-time-7-years <span class="field field--name-title field--type-string field--label-hidden">U of T team takes top spot in self-driving car challenge for 6th time in 7 years</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2024-08/AUTODRIVE_24_5601-crop.jpg?h=3a919dd0&amp;itok=kFCXUnGZ 370w, /sites/default/files/styles/news_banner_740/public/2024-08/AUTODRIVE_24_5601-crop.jpg?h=3a919dd0&amp;itok=YPAb6B8H 740w, /sites/default/files/styles/news_banner_1110/public/2024-08/AUTODRIVE_24_5601-crop.jpg?h=3a919dd0&amp;itok=Q35dvO8b 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2024-08/AUTODRIVE_24_5601-crop.jpg?h=3a919dd0&amp;itok=kFCXUnGZ" alt="U of T's self-driving car avoids a mock moose crossing the road"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-08-07T13:42:55-04:00" title="Wednesday, August 7, 2024 - 13:42" class="datetime">Wed, 08/07/2024 - 13:42</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>As part of the competition, the U of T team's autonomous vehicle had to react to obstacles such as a fake deer moving across the road (photo courtesy of aUToronto)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> 
<div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/alumni" hreflang="en">Alumni</a></div> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/self-driving-cars" hreflang="en">Self-Driving Cars</a></div> <div class="field__item"><a href="/news/tags/undergraduate-students" hreflang="en">Undergraduate Students</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">"Each time we saw an obstacle (a stop sign, a red light, the railroad bar coming down) and the car reacted by stopping and then continuing, we let out a big cheer or a sigh of relief"</div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>A team from the University of Toronto has placed first for the sixth time in seven years in a North American self-driving car competition.</p> <p>After finishing in second place 
last year, <a href="https://www.autodrive.utoronto.ca">the aUToronto team</a> returned to the top spot at the <a href="https://www.autodrivechallenge.com" target="_blank">2024 SAE AutoDrive Challenge II</a>, which was held in June at the Mcity Test Facility in Ann Arbor, Mich.</p> <p>The aUToronto team competed against nine other teams from across Canada and the United States.</p> <p>"Through the AutoDrive Challenge, we are preparing the next generation of engineers to head into the industry, to keep pushing towards the challenging goal of reaching Level 4 autonomous driving," says <strong>Tim Barfoot</strong>, a professor at the U of T Institute for Aerospace Studies (UTIAS) in the Faculty of Applied Science &amp; Engineering and one of the team's academic advisers.</p> <p>"The team did another excellent job this year."</p> <p>The team approached the competition by going back to first principles to ensure they had a reliable and robust system, says <strong>Kelvin Cui</strong>, a U of T Engineering alumnus and the team's principal.</p> <p>He joined aUToronto last fall after five years with the University of Toronto Formula Racing team, where he founded the "driverless" division.</p> <p>"We looked at what was going to get us the most points at competition and made sure that we were not overbuilding our system and adding too much complexity," he says.</p> <p>This meant pushing for additional testing time at UTIAS and achieving more than 900 kilometres of system testing prior to the competition.</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-08/AUTODRIVE_24_5334-crop.jpg?itok=xSJviMQl" width="750" height="500" alt="&quot;&quot;" 
class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>The team placed first out of 10 teams from institutions across the United States and Canada (photo courtesy of aUToronto)</em></figcaption> </figure> <p>A partnership with the AutoDrive team from Queen's University was instrumental to aUToronto's preparation. The aUToronto team drove Artemis, their autonomous vehicle, to Kingston, Ont., to assess the system at Queen's testing facility, which features intersections and electronic streetlights.</p> <p>"We added radar to our vehicle as a new sensor, so we needed to be aware of all the sensor failure modes," says third-year Engineering Science student <strong>Robert Ren</strong>.</p> <p>"A lot of our testing time went into making sure that including radar didn't break anything else in our system, and that it could handle any sensor failure cases."</p> <p>Including radar sensors in the vehicle's perception system allowed it to measure the motion of objects directly, which is not possible with light detection and ranging (LiDAR) sensors.</p> <p>"Radar can help with adverse weather object detections," adds Ren. "So, if the vehicle is operating under heavy rain or fog, the LiDAR is going to be limited, but the radio waves from radar can help the vehicle see what objects are in front and what objects are moving. 
This enables it to make good decisions when driving in uncertain scenarios."</p> <blockquote class="instagram-media" data-instgrm-captioned data-instgrm-permalink="https://www.instagram.com/p/C9ycZUeNM64/?utm_source=ig_embed&amp;utm_campaign=loading" data-instgrm-version="14"><a href="https://www.instagram.com/p/C9ycZUeNM64/?utm_source=ig_embed&amp;utm_campaign=loading" target="_blank">A post shared by aUToronto (@autoronto_uoft)</a></blockquote> <script async src="//www.instagram.com/embed.js"></script> <p>In an event where both LiDAR and radar sensors fail, the aUToronto system can still rely on visual cameras to perform object tracking. This made the team's object tracker much more robust than last year's, when the team experienced sensor failure during a dynamic event.</p> <p><strong>Brian Cheong</strong>, a U of T Engineering master's student who has been a member of aUToronto since 2021, acted as technical director of the autonomy team this year, part of a new leadership structure introduced by Cui.</p> <p>"In the past, it was a lot of work for our team's principal to keep track of all 
the systems," Cheong says. "So instead of having to work directly with all 15 sub-teams, Kelvin created groups of sub-teams that we called stacks, and each stack had a director."</p> <p>The restructuring and technical innovations paid off, with aUToronto completing its first clean sweep of the AutoDrive Challenge II, placing first in all static and dynamic events, including the concept design presentation and the intersection challenge.</p> <p>"The intersection challenge was a big highlight for us," says Cheong. "Kelvin and Robert were in the car, and I was on the sidelines watching with the rest of the team. Each time we saw an obstacle (a stop sign, a red light, the railroad bar coming down) and the car reacted by stopping and then continuing, we let out a big cheer or a sigh of relief.</p> <p>"And then we were all silent as the car approached the final obstacle, which was a deer. We watched as Artemis slowed down to a stop and the deer moved by. Then we screamed and cheered, and we could hear cheering from inside the car."</p> <p>"Our success is entirely a team effort," adds Cui. "It was not smooth sailing before the competition. 
The only reason we won is because everybody put in so much effort to test our vehicle every day.</p> <p>"That's how we were able to get this reliable system across the line."</p> <div> <div class="field field--name-field-media-oembed-video field--type-string field--label-hidden field__item"><iframe src="/media/oembed?url=https%3A//youtu.be/gG7DG-t2aiQ%3Fsi%3DkYGqZF0-x-6a4MBn&amp;max_width=0&amp;max_height=0&amp;hash=6whKFK-X5NSAGZdfMqSydpcgBMCmEPw2x-2wTgtl2jw" width="200" height="113" class="media-oembed-content" loading="eager" title="AutoDrive Challenge II Year 3 Highlight Video"></iframe> </div> </div> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 07 Aug 2024 17:42:55 +0000 Christopher.Sorensen 308926 at Start@UTIAS /node/308585 <span class="field field--name-title field--type-string field--label-hidden">Start@UTIAS</span> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>laurie.bulchak</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-07-25T21:33:52-04:00" title="Thursday, July 25, 2024 - 21:33" class="datetime">Thu, 07/25/2024 - 21:33</time> </span> <div class="field field--name-field-url field--type-string field--label-above"> <div class="field__label">URL</div> <div class="field__item">https://www.utias.utoronto.ca/startutias-entrepreneurship-program/</div> </div> <div class="field field--name-field-tags field--type-entity-reference field--label-above clearfix"> <h3 class="field__label">Tags</h3> <ul class="links field__items"> <li><a href="/news/tags/institute-aerospace-studies" hreflang="en">Institute for Aerospace Studies</a></li> <li><a href="/news/tags/utias" hreflang="en">UTIAS</a></li> </ul> </div> <div class="field field--name-field-campus field--type-entity-reference 
field--label-above"> <div class="field__label">Campus</div> <div class="field__item"><a href="/taxonomy/term/7034" hreflang="en">Off Campus</a></div> </div> Fri, 26 Jul 2024 01:33:52 +0000 laurie.bulchak 308585 at U of T researchers enhance object-tracking abilities of self-driving cars /news/u-t-researchers-enhance-object-tracking-abilities-self-driving-cars <span class="field field--name-title field--type-string field--label-hidden">U of T researchers enhance object-tracking abilities of self-driving cars</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=mDJZAkzx 370w, /sites/default/files/styles/news_banner_740/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=VS33Oojz 740w, /sites/default/files/styles/news_banner_1110/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=lwAIt_Pp 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=mDJZAkzx" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>rahul.kalvapalle</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-05-29T10:59:42-04:00" title="Wednesday, May 29, 2024 - 10:59" class="datetime">Wed, 05/29/2024 - 10:59</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>Sandro Papais, a PhD student, is the co-author of a new paper that introduces a graph-based optimization method to improve object tracking for 
self-driving cars&nbsp;(photo courtesy of aUToronto)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/breaking-research" hreflang="en">Breaking Research</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/self-driving-cars" hreflang="en">Self-Driving Cars</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">The new tools could help robotic systems of autonomous vehicles better track the position and motion of vehicles, pedestrians and cyclists<br> </div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Researchers at the University of Toronto Institute for Aerospace Studies (UTIAS) have introduced a pair of high-tech tools that could improve the safety and reliability of autonomous vehicles by enhancing the reasoning ability of their robotic 
systems.</p> <p>The innovations address multi-object tracking, a process used by robotic systems to track the position and motion of objects, including vehicles, pedestrians and cyclists, in order to plan the path of self-driving cars in densely populated areas.</p> <p>Tracking information is collected from computer vision sensors (2D camera images and 3D LiDAR scans) and filtered at each time stamp, 10 times a second, to predict the future movement of moving objects.</p> <p>"Once processed, it allows the robot to develop some reasoning about its environment. For example, there is a human crossing the street at the intersection, or a cyclist changing lanes up ahead," says <strong>Sandro Papais</strong>, a PhD student at UTIAS in the Faculty of Applied Science &amp; Engineering. "At each time stamp, the robot's software tries to link the current detections with objects it saw in the past, but it can only go back so far in time."</p> <p><a href="https://arxiv.org/pdf/2402.17892">In a new paper</a> presented at the 2024 International Conference on Robotics and Automation in Yokohama, Japan, Papais and co-authors <strong>Robert Ren</strong>, a third-year engineering science student, and Professor <strong>Steven Waslander</strong>, director of UTIAS's <a href="https://www.trailab.utias.utoronto.ca/">Toronto Robotics and AI Laboratory</a>, introduce Sliding Window Tracker (SWTrack), a graph-based optimization method that uses additional temporal information to prevent missed objects.</p> <p>The tool is designed to improve the performance of tracking methods, particularly when objects are occluded from the robot's point of view.</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-05/Objects%20and%20Labels.jpg?itok=mTZFj1NL" width="750" height="426" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>A visualization of the nuScenes dataset used by the researchers. The image is a mosaic of the six camera views around the car, with object bounding boxes rendered over top of the images (image courtesy of the Toronto Robotics and AI Laboratory)</em></figcaption> </figure> <p>"SWTrack widens how far into the past a robot considers when planning," says Papais. "So instead of being limited by what it just saw one frame ago and what is happening now, it can look over the past five seconds and then try to reason through all the different things it has seen."</p> <p>The team tested, trained and validated their algorithm on field data from nuScenes, a public, large-scale autonomous-driving dataset collected by vehicles operating on roads in cities around the world. The data includes human annotations that the team used to benchmark the performance of SWTrack.</p> <p>They found that each time they extended the temporal window, up to a maximum of five seconds, the tracking performance improved. But past five seconds, the algorithm's performance was slowed by computation time.</p> <p>"Most tracking algorithms would have a tough time reasoning over some of these temporal gaps. But in our case, we were able to validate that we can track over these longer periods of time and maintain more consistent tracking for dynamic objects around us," says Papais.</p> <p>Papais says he's looking forward to building on the idea of improving robot memory and extending it to other areas of robotics infrastructure. "This is just the beginning," he says. 
"We're working on the tracking problem, but also other robot problems where we can incorporate more temporal information to enhance perception and robotic reasoning."</p> <p>Another paper, <a href="https://arxiv.org/pdf/2402.12303">co-authored by master's student <strong>Chang Won (John) Lee</strong> and Waslander</a>, introduces UncertaintyTrack, a collection of extensions for 2D tracking-by-detection methods that leverages probabilistic object detection.</p> <p>"Probabilistic object detection quantifies the uncertainty estimates of object detection," explains Lee. "The key thing here is that for safety-critical tasks, you want to be able to know when the predicted detections are likely to cause errors in downstream tasks such as multi-object tracking. These errors can occur because of low-lighting conditions or heavy object occlusion.</p> <p>"Uncertainty estimates give us an idea of when the model is in doubt, that is, when it is highly likely to give errors in predictions. But there's this gap because probabilistic object detectors aren't currently used in multi-object tracking."</p> <p>Lee worked on the paper as part of his undergraduate thesis in engineering science. 
Now a master's student in Waslander's lab, he is researching visual anomaly detection for the Canadarm3, Canada's contribution to the U.S.-led Gateway lunar outpost. "In my current research, we are aiming to come up with a deep-learning-based method that detects objects floating in space that pose a potential risk to the robotic arm," Lee says.</p> <p>Waslander says the advancements outlined in the two papers build on work that his lab has been focusing on for a number of years.</p> <p>"[The Toronto Robotics and AI Laboratory] has been working on assessing perception uncertainty and expanding temporal reasoning for robotics for multiple years now, as they are the key roadblocks to deploying robots in the open world more broadly," Waslander says.</p> <p>"We desperately need AI methods that can understand the persistence of objects over time, and ones that are aware of their own limitations and will stop and reason when something new or unexpected appears in their path. This is what our research aims to do."</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 29 May 2024 14:59:42 +0000 rahul.kalvapalle 307958 at With the launch of its first satellite, student team charts a course to new knowledge /news/launch-its-first-satellite-student-team-charts-course-new-knowledge <span class="field field--name-title field--type-string field--label-hidden">With the launch of its first satellite, student team charts a course to new knowledge</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2024-01/UTAT-Space-Systems-HERON-launch-crop.jpg?h=d082dac7&amp;itok=9Wa3UXmZ 370w, 
/sites/default/files/styles/news_banner_740/public/2024-01/UTAT-Space-Systems-HERON-launch-crop.jpg?h=d082dac7&amp;itok=JbfGqxc8 740w, /sites/default/files/styles/news_banner_1110/public/2024-01/UTAT-Space-Systems-HERON-launch-crop.jpg?h=d082dac7&amp;itok=FhGwd94z 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2024-01/UTAT-Space-Systems-HERON-launch-crop.jpg?h=d082dac7&amp;itok=9Wa3UXmZ" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>rahul.kalvapalle</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-01-19T12:44:03-05:00" title="Friday, January 19, 2024 - 12:44" class="datetime">Fri, 01/19/2024 - 12:44</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>A Falcon 9 rocket lifts off from Vandenberg Space Force Base on Nov. 
11, 2023, carrying a satellite designed and built by the University of Toronto Aerospace Team (photo courtesy of SpaceX)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/aerospace" hreflang="en">Aerospace</a></div> <div class="field__item"><a href="/news/tags/electrical-computer-engineering" hreflang="en">Electrical &amp; Computer Engineering</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/mechanical-industrial-engineering" hreflang="en">Mechanical &amp; Industrial Engineering</a></div> <div class="field__item"><a href="/news/tags/space" hreflang="en">Space</a></div> <div class="field__item"><a href="/news/tags/undergraduate-students" hreflang="en">Undergraduate Students</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">"We worked on this project for so long with such a narrow focus that actually seeing it deployed was very rewarding"</div> </div> <div class="clearfix text-formatted field field--name-body 
field--type-text-with-summary field--label-hidden field__item"><p>Students in the University of Toronto’s Faculty of Applied Science &amp; Engineering recently gathered in the basement of the Sandford Fleming Building – known to many as “The Pit” – to witness the deployment of HERON Mk. II into space.</p> <p>The 3U CubeSat, built and operated by the space systems division of the University of Toronto Aerospace Team (UTAT), was launched into orbit on a Falcon 9 rocket on Nov. 11, 2023, as part of SpaceX’s Transporter-9 rideshare mission, which lifted off from Vandenberg Space Force Base near Lompoc, Calif.</p> <p>The feat was entirely student-funded, with support from U of T Engineering through student levies and UTAT-led fundraising efforts.</p> <p>“The experience of the launch was very surreal,” says master’s degree student <strong>Benjamin Nero</strong>, HERON’s current mission manager.</p> <p>“We worked on this project for so long with such a narrow focus that actually seeing it deployed was very rewarding.”</p> <p>“There are any number of things that could go wrong that might prevent a satellite from deploying,” adds <strong>Zachary Teper</strong>, a fellow master’s degree candidate who is part of the technical development team working on HERON’s ground station.</p> <p>“So, watching each of the callouts coming out of SpaceX mission control, seeing the rocket go up and meet every one of its mission objectives, and then finally seeing our satellite ejected from the dispenser on the correct trajectory was a big relief – because we knew that it was finally in space and on the right path.”</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy"
src="/sites/default/files/styles/scale_image_750_width_/public/2024-01/UTAT-Space-Systems-team-ground-station-crop.jpg?itok=fBLrHH7z" width="750" height="500" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>Members of the UTAT space systems division gather on the sixth-floor roof of the Bahen Centre for Information Technology with the fully assembled ground station (photo by UTAT Space Systems)</em></figcaption> </figure> <p>Launching HERON – short for High frequency Educational Radio communications On a Nanosatellite – was the culmination of years of teamwork that brought together the efforts of more than 100 students.</p> <p>HERON Mk. II, the second iteration of UTAT’s spacecraft, was originally designed and built between 2016 and 2018 for the fourth edition of the <a href="https://www.ic.gc.ca/eic/site/060.nsf/vwapj/CSDCMS.pdf/$file/CSDCMS.pdf">Canadian Satellite Design Challenge</a>. The space systems division was formed in 2014, and many of the students who worked on the initial HERON design and build have since graduated. But the current operations team continued to develop the satellite and to renew the student levy that allowed them to secure their space launch.</p> <p>“The original objective for HERON was to conduct a biology experiment in space,” says Nero, who joined the team in 2019 during his second year of undergraduate studies. “But because of delays in the licensing process, we were unable to continue that mission objective.
So, we re-scoped and shifted our focus to amateur radio communication and knowledge building.”</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-01/5-crop.jpg?itok=pLDFm8_s" width="750" height="422" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>From left to right: HERON Mk. I (2016), HERON Mk. II Prototype (2018), HERON Mk. II Softstack (2020), HERON Mk. II Flight Model (2021) (photos by UTAT Space Systems)</em></figcaption> </figure> <p>Once the satellite’s final assembly was completed in 2021, the team began flight model testing and assembling a ground station, while also managing the logistics of the regulatory approvals needed to complete the launch.</p> <p>“It’s difficult to put something in space, both technically and bureaucratically,” says Nero. “There are a lot of different governments that care about what you’re doing and want to know when and how you’re doing it.”</p> <p>Getting to space was a significant milestone for the team, but it’s still only the beginning of their work.</p> <p>“The goal for us as a design team is to start gathering institutional knowledge that we didn’t have before,” says <strong>Reid Sox-Harris</strong>, an undergraduate student who is HERON’s ground station manager and the electrical lead for UTAT’s next space mission, FINCH (Field Imaging Nanosatellite for Crop residue Hyperspectral mapping).</p> <p>“We’ve never operated a satellite. So, we’re taking a lot of lessons learned with us through this process.”</p> <p>For example, when a satellite is deployed for the first time, the ground control team only has a rough idea of its movement and eventual location.
They must simulate the launch to figure out exactly where it is before they can establish a connection. And each time they receive new positional data, they must rerun the simulation.</p> <p>“We have to take into account effects such as air resistance, the sun’s solar cycles and the gravitational effects of the sun, the moon and the Earth – it’s a fairly complicated simulation,” Sox-Harris says.<br> <br> Nero adds: “Part of the difficulty with a simulation is that a model is only useful for a certain period. An old estimate could result in as much as a few kilometres of drift from the satellite’s actual position per day.”</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-01/HERON-gs_937-crop.jpg?itok=FpwF15sA" width="750" height="500" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>HERON’s ground station on the roof of the Bahen Centre (photo by UTAT Space Systems)</em></figcaption> </figure> <p>The team was tasked with designing a ground station capable not only of communicating with a satellite more than 500 kilometres away, but also of surviving a frigid and snowy Canadian winter.</p> <p>“For any project, the most important thing you should be doing is testing,” says second-year student <strong>Swarnava Ghosh</strong>, who primarily works on the ground station software. “One challenge with our ground station currently is that there are too many variables that are not fully tested – and everything in the chain needs to be perfect for the communication to work. If the ground station is not pointing in the right direction, we won’t get a signal and we won’t establish communication.
And if the amplifier is not working, then we won’t establish communication.”</p> <p>The team is confident that they will ultimately resolve any outstanding issues and establish communications with HERON. More importantly, they will be able to take what they’ve learned and apply it to the next mission.</p> <p>“With FINCH, we want to make sure the ground station software and satellite can communicate on the ground,” says Sox-Harris. “Right now, there are over 500 kilometres between the satellite and ground station, so we can’t fly up there and test whether a command has worked.”</p> <p>FINCH is set to launch in late 2025 on a rideshare rocket flight. Its current mission objective is to generate hyperspectral imaging maps of crop residue on farm fields in Manitoba from low-Earth orbit.</p> <p>There are many technical developments that are new to FINCH and weren’t applicable to HERON, the team says, including a novel optical system for remote sensing that is being developed by students.</p> <p>“The risks associated with FINCH are mitigated by the work that is being performed by HERON right now. We’re learning many lessons that will be directly applicable to our next mission, and we’ll continue to learn from HERON for at least another year,” says Sox-Harris.</p> <p>“This means the FINCH mission can be more complicated, it can move faster and ultimately we can have better reliability, which is something that we always strive for in aerospace.”</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Fri, 19 Jan 2024 17:44:03 +0000 rahul.kalvapalle 305347 at U of T researchers partner with Siemens Energy to tackle sustainable energy production /news/u-t-researchers-partner-siemens-energy-tackle-sustainable-energy-production
<span class="field field--name-title field--type-string field--label-hidden">U of T researchers partner with Siemens Energy to tackle sustainable energy production</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2024-01/MicrosoftTeams-image-%287%29-crop.jpg?h=81d682ee&amp;itok=iaFYfLIx 370w, /sites/default/files/styles/news_banner_740/public/2024-01/MicrosoftTeams-image-%287%29-crop.jpg?h=81d682ee&amp;itok=RYJSYC_g 740w, /sites/default/files/styles/news_banner_1110/public/2024-01/MicrosoftTeams-image-%287%29-crop.jpg?h=81d682ee&amp;itok=vqhA3Qfg 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2024-01/MicrosoftTeams-image-%287%29-crop.jpg?h=81d682ee&amp;itok=iaFYfLIx" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-01-10T14:50:56-05:00" title="Wednesday, January 10, 2024 - 14:50" class="datetime">Wed, 01/10/2024 - 14:50</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>PhD student Yazdan Naderzadeh (left) investigates flames with lasers in the Propulsion and Energy Conversion Lab at UTIAS (photo by Neil Ta)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/authors-reporters/selah-katona" hreflang="en">Selah Katona</a></div> </div> <div class="field field--name-field-topic
field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/industry-partnerships" hreflang="en">Industry Partnerships</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/sustainability" hreflang="en">Sustainability</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">'Together, we hope to unravel the complexities of hydrogen combustion, paving the way for cleaner and more efficient engines'</div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Researchers in the University of Toronto’s Faculty of Applied Science &amp; Engineering have partnered with Siemens Energy to tackle a key challenge in the energy sector: sustainable energy conversion for propulsion and power generation – such as developing gas turbine engines that can run on sustainable energy sources like hydrogen.</p> <p>Led by Associate Professor <strong>Swetaprovo Chaudhuri</strong> from the U of T Institute for Aerospace Studies (UTIAS), the initiative aims to rethink traditional gas turbine engines to reduce carbon emissions from both aviation and land-based fuel consumption.</p> <p>Chaudhuri’s team is exploring hydrogen combustion as a viable option
since it can be burned without producing carbon dioxide.</p> <p>However, the transition is not without its challenges. For one, hydrogen is a small, highly reactive molecule, causing flames to move five to ten times faster than those of natural gas. This makes existing combustors and engines that run on natural gas incapable of handling pure hydrogen.</p> <p>Another key challenge is the lack of infrastructure available to transport hydrogen in the way pipelines are used to move natural gas. Until such infrastructure is developed, Chaudhuri’s team is researching how to build reliable fuel-flex gas turbine engines that can work on both fuels.</p> <p>“Hydrogen and natural gas are vastly different – it’s like comparing a Bugatti Veyron to a public bus in both speed and size,” says Chaudhuri, who leads the Propulsion &amp; Energy Conversion Laboratory at UTIAS. “The critical question is: ‘How can engines be designed to accommodate both fuels seamlessly?’”</p> <p>The team is led by Chaudhuri in collaboration with Associate Professor <strong>Jeff Bergthorson</strong> at McGill University, Professor <strong>Étienne Robert</strong> and Assistant Professor <strong>Bruno Savard</strong> at Polytechnique Montréal, <strong>Patrizio Vena</strong> at the National Research Council Canada and engineers from Siemens Energy Canada in Montreal.</p> <p>The project received an Alliance Mission Grant from the Natural Sciences and Engineering Research Council (NSERC) to build a comprehensive understanding that will guide the creation of fuel-flex gas turbine engines.</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-01/MicrosoftTeams-image-%288%29-crop.jpg?itok=IkkOJvxr" width="750" height="501" alt="&quot;&quot;"
class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>PhD candidate Yazdan Naderzadeh (left) and master’s student Scott Watson from the Propulsion and Energy Conversion Lab work with a swirling hydrogen flame (photo by Praful Kumar)</em></figcaption> </figure> <p>The researchers have constructed a model lab-scale combustor at the Propulsion and Energy Conversion Laboratory at UTIAS to study the behaviour of natural gas and hydrogen flames within engines. These experiments aim to understand the intricacies of hydrogen combustion in order to establish engineering principles and guidelines for future engine development.</p> <p>While practical applications are on the horizon, the immediate goal is to establish a robust knowledge base that will be essential for designing engines that can efficiently and safely use hydrogen as a fuel source.</p> <p>“Currently, long-range aircraft cannot, even theoretically, fly on batteries. We need to make significant strides towards combustion engines that use hydrogen or other carbon-neutral fuels to substantially reduce carbon emissions in these critical sectors,” says Chaudhuri.</p> <p>In a separate, stand-alone project, Chaudhuri and his research group are developing a self-decarbonizing combustor, which separates hydrogen and carbon from natural gas within the combustor. This process not only allows hydrogen to be used as fuel, but could also allow the carbon byproduct to be used to offset the additional cost associated with decarbonization.</p> <p>“Our collaboration with Siemens Energy marks an exciting synergy between academia and industry,” says Chaudhuri.
“Siemens Energy’s gas turbines for generating power have historically used natural gas, so this partnership represents a significant step towards a greener future.</p> <p>“Together, we hope to unravel the complexities of hydrogen combustion, paving the way for cleaner and more efficient engines.”</p> <p>The development and commissioning of the fuel-flex combustor, capable of safely stabilizing both hydrogen and natural gas flames, presents numerous research opportunities for students.</p> <p><strong>Yazdan Naderzadeh</strong> and <strong>Scott Watson</strong>, a PhD candidate and a master’s student, respectively, in Chaudhuri’s lab, are working on the project. “I am so excited to work on the ongoing fuel-flex combustor project, addressing concerns related to clean emissions and compatibility with conventional gas turbine burners,” says Naderzadeh. “This endeavor allows for a thorough study and understanding of the challenges associated with hydrogen as a prospective fuel in the aviation industry and gas power plants.”</p> <h3><a href="https://bluedoor.utoronto.ca/">Learn more about industry partnerships at U of T</a></h3> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 10 Jan 2024 19:50:56 +0000 Christopher.Sorensen 305214 at AI algorithm improves predictive models of complex dynamical systems /news/ai-algorithm-improves-predictive-models-complex-dynamical-systems <span class="field field--name-title field--type-string field--label-hidden">AI algorithm improves predictive models of complex dynamical systems</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2023-11/Nair-Course-photo.jpg?h=afdc3185&amp;itok=Lt8It-CP 370w,
/sites/default/files/styles/news_banner_740/public/2023-11/Nair-Course-photo.jpg?h=afdc3185&amp;itok=b2uZAIrk 740w, /sites/default/files/styles/news_banner_1110/public/2023-11/Nair-Course-photo.jpg?h=afdc3185&amp;itok=Wdk23c30 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2023-11/Nair-Course-photo.jpg?h=afdc3185&amp;itok=Lt8It-CP" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2023-11-15T09:20:37-05:00" title="Wednesday, November 15, 2023 - 09:20" class="datetime">Wed, 11/15/2023 - 09:20</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>From left to right: Professor Prasanth Nair and PhD student Kevin Course are the authors of a new paper in Nature that introduces&nbsp;a new machine learning algorithm that addresses the challenge of imperfect knowledge about system dynamics&nbsp;(supplied images)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/breaking-research" hreflang="en">Breaking Research</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/artificial-intelligence" 
hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">Developed by U of T researchers, the new approach could have applications ranging from predicting the performance of aircraft engines to forecasting climate change</div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Researchers at the University of Toronto have taken a significant step towards enabling reliable predictions of complex dynamical systems when there are many uncertainties in the available data or missing information.</p> <p>In <a href="https://www.nature.com/articles/s41586-023-06574-8">a recent paper published in <em>Nature</em></a>, <strong>Prasanth B. Nair</strong>, a professor at the U of T Institute for Aerospace Studies (UTIAS) in the Faculty of Applied Science &amp; Engineering, and UTIAS PhD candidate <strong>Kevin Course</strong> introduced a new machine learning algorithm that surmounts the real-world challenge of imperfect knowledge about system dynamics.
The computer-based mathematical modelling approach is used for problem-solving and better decision-making in complex systems, where many components interact with each other.</p> <p>The researchers say the work could have numerous applications, ranging from predicting the performance of aircraft engines to forecasting changes in global climate or the spread of viruses.</p> <p>“For the first time, we are able to apply state estimation to problems where we don’t know the governing equations, or the governing equations have a lot of missing terms,” says Course, who is the paper’s first author.</p> <p>“In contrast to standard techniques, which usually require a state estimate to infer the governing equations and vice versa, our method learns the missing terms in the mathematical model and a state estimate simultaneously.”</p> <p>State estimation, also known as data assimilation, refers to the process of combining observational data with computer models to estimate the current state of a system. Traditionally, it requires strong assumptions about the type of uncertainties that exist in a mathematical model.</p> <p>“For example, let’s say you have constructed a computer model that predicts the weather and, at the same time, you have access to real-time data from weather stations providing actual temperature readings,” says Nair. “Due to the model’s inherent limitations and simplifications – which are often unavoidable when dealing with complex real-world systems – the model predictions may not match the actual observed temperature you are seeing.</p> <p>“State estimation combines the model’s prediction with the actual observations to provide a corrected or better-calibrated estimate of the current temperature.
It effectively assimilates the data into the model to correct its state.”</p> <p>However, it has previously been difficult to estimate the underlying state of complex dynamical systems in situations where the governing equations are completely or partially unknown. The new algorithm provides a rigorous statistical framework to address this long-standing problem.</p> <p>“This problem is akin to deciphering the ‘laws’ that a system obeys without having explicit knowledge about them,” says Nair, whose research group is developing algorithms for the mathematical modelling of systems and phenomena encountered in various areas of engineering and science.</p> <p>A byproduct of Course and Nair’s algorithm is that it also helps to characterize missing terms, or even the entirety of the governing equations, which determine how the values of unknown variables change when one or more of the known variables change.</p> <p>The main innovation underpinning the work is a reparametrization trick for stochastic variational inference with Markov Gaussian processes that enables an approximate Bayesian approach to solving such problems.
This new development allows researchers to deduce the equations that govern the dynamics of complex systems and arrive at a state estimate using indirect and “noisy” measurements.</p> <p>“Our approach is computationally attractive since it leverages stochastic – that is, randomly determined – approximations that can be efficiently computed in parallel, and, in addition, it does not rely on computationally expensive forward solvers in training,” says Course.</p> <p>While Course and Nair approached their research from a theoretical viewpoint, they were able to demonstrate practical impact by applying their algorithm to problems ranging from modelling fluid flow to predicting the motion of black holes.</p> <p>“Our work is relevant to several branches of science, engineering and finance, as researchers from these fields often interact with systems where first-principles models are difficult to construct or existing models are insufficient to explain system behaviour,” says Nair.</p> <p>“We believe this work will open the door for practitioners in these fields to better intuit the systems they study,” adds Course.
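Nair's weather example can be made concrete with a minimal, scalar data-assimilation update. This is a textbook Kalman-style correction offered purely to illustrate state estimation in general; it is not the variational algorithm from the Nature paper, and all numbers are invented:

```python
# Minimal scalar state-estimation (data assimilation) update.
# Illustrative only: a textbook Kalman-style correction, NOT the
# stochastic variational method described in the paper; all values
# below are made up for the example.

def assimilate(model_temp, model_var, obs_temp, obs_var):
    """Blend a model prediction with an observation, weighting each
    source by its uncertainty (variance)."""
    gain = model_var / (model_var + obs_var)         # how much to trust the observation
    corrected = model_temp + gain * (obs_temp - model_temp)
    corrected_var = (1.0 - gain) * model_var         # uncertainty shrinks after the update
    return corrected, corrected_var

# Model forecasts 21.0 C with variance 4.0; a station reads 19.0 C with variance 1.0.
estimate, variance = assimilate(21.0, 4.0, 19.0, 1.0)
print(estimate, variance)  # the corrected state lies between model and observation
```

The corrected estimate lands closer to the more trustworthy source (here, the station reading); the contribution of Course and Nair's work is to perform this kind of correction even when the governing equations behind the model are partly or wholly unknown.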
“Even in situations where high-fidelity mathematical models are available, this work can be used for probabilistic model calibration and to discover missing physics in existing models.</p> <p>“We have also been able to successfully use our approach to efficiently train neural stochastic differential equations, which is a type of machine learning model that has shown promising performance for time-series datasets.”</p> <p>While the paper primarily addresses challenges in state estimation and governing equation discovery, the researchers say it provides a general groundwork for robust data-driven techniques in computational science and engineering.</p> <p>“As an example, our research group is currently using this framework to construct probabilistic reduced-order models of complex systems. We hope to expedite decision-making processes integral to the optimal design, operation and control of real-world systems,” says Nair.</p> <p>“Additionally, we are also studying how the inference methods stemming from our research may offer deeper statistical insights into stochastic differential equation-based generative models that are now widely used in many artificial intelligence applications.”</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 15 Nov 2023 14:20:37 +0000 Christopher.Sorensen 304471 at Researchers help robots navigate crowded spaces with new visual perception method /news/researchers-help-robots-navigate-crowded-spaces-new-visual-perception-method <span class="field field--name-title field--type-string field--label-hidden">Researchers help robots navigate crowded spaces with new visual perception method</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item">
<img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=FnXXVi6F 370w, /sites/default/files/styles/news_banner_740/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=7k3rU_TC 740w, /sites/default/files/styles/news_banner_1110/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=mtI0yfdN 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=FnXXVi6F" alt="crowded downtown city street with many people walking across an intersection"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-11-09T15:10:52-05:00" title="Wednesday, November 9, 2022 - 15:10" class="datetime">Wed, 11/09/2022 - 15:10</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">Researchers from the U of T Institute for Aerospace Studies have developed a system that improves how robots stitch together a set of images taken from a moving camera to build a 3D model of their environments (photo by iStock/LeoPatrizi)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/breaking-research" hreflang="en">Breaking Research</a></div> </div> <div class="field field--name-field-story-tags
field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/alumni" hreflang="en">Alumni</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>A team of researchers at the University of Toronto has found a way to enhance the visual perception of robotic systems by coupling two different types of neural networks.</p> <p>The innovation could help autonomous vehicles navigate busy streets or enable medical robots to work effectively in crowded hospital hallways.</p> <p>“What tends to happen in our field is that when systems don’t perform as expected, the designers make the networks bigger – they add more parameters,” says <strong>Jonathan Kelly</strong>, an assistant professor at the <a href="https://www.utias.utoronto.ca/">University of Toronto Institute for Aerospace Studies</a> in the Faculty of Applied Science &amp; Engineering.</p> <p>“What we’ve done instead is to carefully study how the pieces should fit together.
Specifically, we investigated how two pieces of the motion estimation problem – accurate perception of depth and motion – can be joined together in a robust way."&nbsp;&nbsp;</p> <p>Researchers in Kelly's&nbsp;<a href="https://starslab.ca/">Space and Terrestrial Autonomous Robotic Systems</a>&nbsp;lab aim to build reliable systems that can help humans accomplish a variety of tasks. For example, they've designed&nbsp;<a href="https://news.engineering.utoronto.ca/wheelchairs-get-robotic-retrofit-become-self-driving/">an electric wheelchair that can automate some common tasks</a>&nbsp;such as navigating through doorways.&nbsp;&nbsp;</p> <p>More recently, they've focused on techniques that will help robots move out of the carefully controlled environments in which they are commonly used today and into the less predictable world&nbsp;humans are accustomed to navigating.&nbsp;&nbsp;</p> <p>"Ultimately, we are looking to develop situational awareness for highly dynamic environments where people operate, whether it's a crowded hospital hallway, a busy public square&nbsp;or a city street full of traffic and pedestrians," says Kelly.&nbsp;&nbsp;</p> <p>One challenging problem that robots must solve in all of these spaces is known to the robotics community as "structure from motion." This is the process by which robots stitch together a set of images taken from a moving camera to build a 3D model of the environment they are in. The process is analogous to the way humans use their eyes to perceive the world around them.&nbsp;&nbsp;</p> <p>In today's robotic systems, structure from motion is typically achieved in two steps, each of which uses different information from a set of monocular images. One is depth perception, which tells the robot how far away the objects in its field of vision are.
The other, known as egomotion, describes the 3D movement of the robot in relation to its environment.&nbsp;</p> <p>"Any robot navigating within a space needs to know how far static and dynamic objects are in relation to itself, as well as how its motion changes a scene," says Kelly. "For example, when a train moves along a track, a passenger looking out a window can observe that objects at a distance appear to move slowly, while objects nearby zoom past."&nbsp;&nbsp;</p> <p>&nbsp;</p> <div class="media_embed" height="500px" width="750px"><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen frameborder="0" height="500" src="https://www.youtube.com/embed/8Oij81bEoH0" title="YouTube video player" width="750"></iframe></div> <p>&nbsp;</p> <p>The challenge is that in many current systems, depth estimation is separated from motion estimation – there is no explicit sharing of information between the two neural networks. Joining depth and motion estimation together ensures that each&nbsp;is consistent with the other.&nbsp;&nbsp;&nbsp;</p> <p>"There are constraints on depth that are defined by motion, and there are constraints on motion that are defined by depth," says Kelly.
"If the system doesn't couple these two neural network components, then&nbsp;the end result is an inaccurate estimate of where everything is in the world and where the robot is in relation to it."&nbsp;</p> <p>In a recent study, two of Kelly's&nbsp;students –&nbsp;<strong>Brandon Wagstaff</strong>, a PhD candidate, and former PhD student&nbsp;<strong>Valentin Peretroukhin</strong>&nbsp;–&nbsp;investigated and improved on existing structure from motion methods.&nbsp;</p> <p>Their new system makes the egomotion prediction a function of depth, increasing the system's overall accuracy and reliability.&nbsp;<a href="https://www.youtube.com/watch?v=6QEDCooyUjE">They recently presented their work</a> at the International Conference on Intelligent Robots and Systems (IROS) in Kyoto, Japan.&nbsp;&nbsp;</p> <p>"Compared with existing learning-based methods, our new system was able to reduce the motion estimation error by approximately 50 per cent," says Wagstaff.&nbsp;&nbsp;</p> <p>"This improvement in motion estimation accuracy was demonstrated not only on data similar to that used to train the network, but also on significantly different forms of data, indicating that the proposed method was able to generalize across many different environments."&nbsp;</p> <p>Maintaining accuracy when operating within novel environments is challenging for neural networks.
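The depth–motion coupling Kelly describes can be illustrated with the standard reprojection constraint: a pixel's predicted depth and the camera's estimated egomotion must together explain where that pixel lands in the next frame. The sketch below is a simplified illustration with made-up camera parameters (not the STARS lab's actual code), and it also reproduces the train-window effect the article mentions: near points shift across the image more than far ones.

```python
import numpy as np

def reproject(u, v, depth, K, R, t):
    """Warp pixel (u, v) from frame 1 into frame 2 using its depth and the
    camera's egomotion (R, t). Depth and motion jointly determine the result,
    which is the consistency constraint that couples the two estimates."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # back-project to a 3D ray
    p1 = depth * ray                                # 3D point in frame-1 coordinates
    p2 = R @ p1 + t                                 # apply egomotion
    uvw = K @ p2                                    # project into frame 2
    return uvw[:2] / uvw[2]

# Hypothetical pinhole camera and a 1 m forward motion (no rotation).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, -1.0])

near = reproject(420, 240, 10.0, K, R, t)   # a point 10 m away shifts by roughly 11 px
far = reproject(420, 240, 100.0, K, R, t)   # the same pixel at 100 m shifts only about 1 px
```

As in the train-window analogy, the nearby point moves much farther across the image than the distant one; a depth estimate that violated this relationship would be inconsistent with the estimated motion, and vice versa.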
The team has since expanded their research beyond visual motion estimation to include inertial sensing – an extra sensor that is akin to the vestibular system in the human ear.&nbsp;&nbsp;</p> <p>"We are now working on robotic applications that can mimic a human's eyes and inner ears, which provide information about balance, motion and acceleration," says Kelly.&nbsp;&nbsp;&nbsp;</p> <p>"This will enable even more accurate motion estimation to handle situations like dramatic scene changes – such as an environment suddenly getting darker when a car enters a tunnel, or a camera failing when it looks directly into the sun."&nbsp;&nbsp;</p> <p>The potential applications for such new approaches are diverse, from improving the handling of self-driving vehicles to enabling aerial drones to fly safely through crowded environments to deliver goods or carry out environmental monitoring.&nbsp;&nbsp;</p> <p>"We are not building machines that are left in cages," says Kelly. "We want to design robust robots that can move safely around people and environments."&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 09 Nov 2022 20:10:52 +0000 Christopher.Sorensen 177980 at U of T student team takes first place at International Small Wind Turbine Contest /news/u-t-student-team-takes-first-place-international-small-wind-turbine-contest <span class="field field--name-title field--type-string field--label-hidden">U of T student team takes first place at International Small Wind Turbine Contest</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/UTWind_DesignPhoto-crop.jpg?h=afdc3185&amp;itok=3vOUHy6F 370w,
/sites/default/files/styles/news_banner_740/public/UTWind_DesignPhoto-crop.jpg?h=afdc3185&amp;itok=tBVfRCqT 740w, /sites/default/files/styles/news_banner_1110/public/UTWind_DesignPhoto-crop.jpg?h=afdc3185&amp;itok=4m2zo9jM 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/UTWind_DesignPhoto-crop.jpg?h=afdc3185&amp;itok=3vOUHy6F" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-07-04T11:24:48-04:00" title="Monday, July 4, 2022 - 11:24" class="datetime">Mon, 07/04/2022 - 11:24</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">The UTWind team stands next to their winning prototype turbine at the Open Jet Facility wind tunnel at Delft University of Technology (photo by Niels Adema/Hanze University of Applied Sciences)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/authors-reporters/tyler-irving" hreflang="en">Tyler Irving</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/global-lens" hreflang="en">Global Lens</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/alumni" hreflang="en">Alumni</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty
of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/global" hreflang="en">Global</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/sustainability" hreflang="en">Sustainability</a></div> <div class="field__item"><a href="/news/tags/undergraduate-students" hreflang="en">Undergraduate Students</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>In their first-ever competition,&nbsp;UTWind&nbsp;– <a href="https://www.utwind.com/">a team of undergraduate and graduate students</a> from the University of Toronto's Faculty of Applied Science &amp; Engineering – has taken the top prize in an international challenge to design and build a small-scale wind turbine.</p> <p>"While we always strived to be a competitive team from the beginning and knew that we had a strong design, we definitely didn't expect to win first place," says&nbsp;<strong>David Petriw</strong>, a third-year materials science and engineering student who is&nbsp;a member of UTWind.</p> <p>"The morale of the team is at an all-time high, and we are going to celebrate this win in a big way."</p> <p><a href="https://www.hanze.nl/nld/onderwijs/techniek/instituut-voor-engineering/organisatie/contest/international-small-wind-turbine-contest">The&nbsp;International Small Wind Turbine Contest</a> (ISWTC)&nbsp;is hosted annually by Hanze University of Applied Sciences in Groningen, Netherlands.
To clinch first place, UTWind edged out teams from Denmark, Germany, Poland and Egypt.</p> <p>"The goal of ISWTC is to build and demonstrate a wind turbine designed for rural regions in Sub-Saharan Africa," says&nbsp;<strong>Andrew Ilersich</strong>, a&nbsp;PhD candidate at the U of T Institute for Aerospace Studies (UTIAS) and&nbsp;aerodynamics lead for UTWind.</p> <p>"Every aspect of our design had to be tailored to, or at least compatible with, the region it would be sold and operated in. We also had to show that our design was sustainable, being made from recyclable, low-cost, and locally available materials."</p> <p>Unlike the large turbines used in commercial wind farms, which can rise to over 100 metres and generate megawatts of power each, small wind turbines (SWTs) are designed for generation on scales from a few hundred watts to a few kilowatts.</p> <p>To win the contest, teams must demonstrate top-of-class performance across a number of criteria, including power generation, cut-in speed, estimated annual energy production and coefficient of power, which is a measure of the turbine's efficiency.</p> <p>Performance was measured at the Open Jet Facility wind tunnel at Delft University of Technology.
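The coefficient of power mentioned above compares the electrical power a turbine extracts with the kinetic power carried by the wind passing through its rotor disc. A quick sketch with illustrative numbers (not UTWind's actual figures):

```python
import math

def coefficient_of_power(power_w, rotor_diameter_m, wind_speed_ms, air_density=1.225):
    """Cp = extracted power / available wind power, where the available
    power through the rotor disc is 0.5 * rho * A * v^3."""
    swept_area = math.pi * (rotor_diameter_m / 2.0) ** 2
    wind_power = 0.5 * air_density * swept_area * wind_speed_ms ** 3
    return power_w / wind_power

# Hypothetical small turbine: a 1.1 m rotor producing 100 W in a 7 m/s wind.
cp = coefficient_of_power(100.0, 1.1, 7.0)  # roughly 0.50
```

No turbine can exceed the Betz limit of about 0.593, so sustaining a Cp near 0.5 even at low wind speeds, as the team describes below, is very strong performance for a small turbine.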
After that, the teams headed to the&nbsp;Science of Making Torque Conference&nbsp;in Delft to present their business case.</p> <p>&nbsp;</p> <div class="media_embed" width="1px"> <blockquote class="instagram-media" data-instgrm-captioned data-instgrm-permalink="https://www.instagram.com/p/CefDcCnKk_Y/?utm_source=ig_embed&amp;utm_campaign=loading" data-instgrm-version="14"> <p><a href="https://www.instagram.com/p/CefDcCnKk_Y/?utm_source=ig_embed&amp;utm_campaign=loading" target="_blank">View this post on Instagram: a post shared by UTWind (@utwindclub)</a></p> </blockquote> <script async src="//www.instagram.com/embed.js"></script></div> <p>&nbsp;</p> <p>The process of creating the prototype took more than a year from start to finish.</p> <p>"We began the design phase in the beginning of 2021 and the whole assembly was built in winter semester 2022," says&nbsp;<strong>Ashley Best</strong>, a third-year student in materials science and engineering who is&nbsp;media team lead for UTWind.</p> <p>"Our turbine is made from wood and 3D-printed plastics.
A few parts were outsourced to our sponsoring machine shop, Protocase, but the majority of the fabrication was done in house by our team – 3D printing, laser cutting, drill pressing, lathing, milling and assembly."</p> <p>"One of the things that set our team apart was our high coefficient of power, even when operating at very low wind speeds," says <strong>Suraj Bansal</strong>,&nbsp;UTWind co-president and technical adviser and a PhD candidate at UTIAS.</p> <p>"In addition, we had a very modular, low-cost and sustainable construction, as well as a self-starting wind-turbine design thanks to our active pitch control system. We are currently creating a mobile app to control and monitor the wind turbine's performance right from our mobile devices."</p> <p>UTWind is one of U of T Engineering's newest design teams, co-founded in January 2021 by Bansal and UTIAS alumnus&nbsp;<strong>Ben Gibson</strong>.</p> <p>"I was a member of a similar wind turbine design team at the University of Manitoba, while Suraj had prior experience designing extreme-scale wind turbines from his master's research work in the U.S.," Gibson says.</p> <p>"We wanted to pass as much of that knowledge on as we could, while both having fun and pushing ourselves to the maximum.
And so far, it has worked out great."</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Mon, 04 Jul 2022 15:24:48 +0000 Christopher.Sorensen 175486 at U of T's aUToronto team wins first competition of AutoDrive Challenge sequel /news/u-t-s-autoronto-team-wins-first-competition-autodrive-challenge-sequel <span class="field field--name-title field--type-string field--label-hidden">U of T's aUToronto team wins first competition of AutoDrive Challenge sequel </span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/aUToronto2022_crop.jpg?h=afdc3185&amp;itok=HFjmF2oB 370w, /sites/default/files/styles/news_banner_740/public/aUToronto2022_crop.jpg?h=afdc3185&amp;itok=lX7IIV1K 740w, /sites/default/files/styles/news_banner_1110/public/aUToronto2022_crop.jpg?h=afdc3185&amp;itok=p6ubdIaM 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/aUToronto2022_crop.jpg?h=afdc3185&amp;itok=HFjmF2oB" alt="AutoDrive Challenge team"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-06-14T15:49:13-04:00" title="Tuesday, June 14, 2022 - 15:49" class="datetime">Tue, 06/14/2022 - 15:49</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">The aUToronto team, made up mostly of U of T undergraduate students, won the first phase of the AutoDrive
Challenge II, which took place earlier this month in Ann Arbor, Mich. (photo courtesy aUToronto)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/authors-reporters/tyler-irving" hreflang="en">Tyler Irving</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/self-driving-cars" hreflang="en">Self-Driving Cars</a></div> <div class="field__item"><a href="/news/tags/undergraduate-students" hreflang="en">Undergraduate Students</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><a href="https://www.autodrive.utoronto.ca/">A self-driving vehicle team</a> from the University of Toronto's Faculty of Applied Science &amp; Engineering has taken the top spot overall in the first competition of the four-year AutoDrive Challenge&nbsp;II.</p> <p>The achievement by aUToronto <a href="https://news.engineering.utoronto.ca/autodrive-challenge-u-of-t-engineering-places-first-for-the-fourth-straight-year/">continues an impressive winning streak for the team</a>, which&nbsp;consistently placed first throughout the original four-year AutoDrive Challenge.</p> <p>In the current contest, the
intercollegiate competition's original concept has been expanded with more teams and more sophisticated tasks as participants&nbsp;develop and demonstrate an autonomous vehicle (AV) that can navigate urban driving courses.</p> <p>"This year was a fresh new start for us," says&nbsp;<strong>Frank (Chude) Qian</strong>, a master's candidate at the U of T Institute for Aerospace Studies (UTIAS) and&nbsp;team principal for aUToronto. "We have a very young team, with nearly 90 per cent&nbsp;of the students new to the competition."</p> <p>Approximately 85 per cent&nbsp;of these students are undergraduates from across U of T Engineering's departments and divisions. The remainder are graduate students or undergraduates from other parts of U of T, including the department of computer science in the Faculty of Arts &amp; Science.</p> <p><img alt="" src="/sites/default/files/PerceptionCart-crop.jpg" style="width: 750px; height: 500px;"></p> <p><em>Throughout the fall of 2021 and winter of 2022, the aUToronto team spent hours designing, training and testing their perception cart, pictured here at the U of T Institute for Aerospace Studies (photo courtesy of&nbsp;aUToronto)</em></p> <p>A total of 10 institutions from across North America sent teams to AutoDrive Challenge&nbsp;II.
They assembled earlier this month at Mcity in Ann Arbor, Mich., a unique purpose-built proving ground for testing the performance and safety of connected and automated vehicles.</p> <p>"This is the first year of the second round of the AutoDrive competition, so the team was required to design, build and code everything from scratch," says&nbsp;<strong>Steven Waslander</strong>,&nbsp;an associate professor at UTIAS who advises the team along with fellow faculty members <strong>Tim Barfoot</strong>,&nbsp;<strong>Jonathan Kelly</strong>&nbsp;and&nbsp;<strong>Angela Schoellig</strong>.</p> <p>"It was a monumental effort and one that shows the true depth of talent and dedication of all the members of this amazing group of students."</p> <p>In the first phase of the four-year competition, the teams were using what are known as perception carts.</p> <p>"We use these to validate the design of our perception system, which we will incorporate onto a real vehicle for next year's competition," says Qian. "Our brand-new sensor suite is based on a new solid-state LiDAR modality."</p> <p>LiDAR is a sensing technology that works in a similar way to radar, except that it uses laser light instead of radio waves.
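Each LiDAR return is essentially a range measured along a known beam direction; converting those polar measurements to Cartesian coordinates is what builds up the point cloud behind the 3D representation the article describes. A minimal sketch of that conversion (a generic spherical-to-Cartesian formula, not the team's actual sensor driver):

```python
import math

def lidar_return_to_xyz(range_m, azimuth_rad, elevation_rad):
    """Convert one lidar return (range plus beam angles) to a 3D point in the
    sensor frame. Accumulating these over a full scan yields the point cloud
    from which the vehicle builds its picture of the surroundings."""
    horizontal = range_m * math.cos(elevation_rad)  # projection onto the ground plane
    x = horizontal * math.cos(azimuth_rad)
    y = horizontal * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)

# A return 10 m straight ahead lands at (10, 0, 0) in the sensor frame.
point = lidar_return_to_xyz(10.0, 0.0, 0.0)
```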
It is a key component of the sensor suite – which also includes traditional radar and visual cameras similar to those found in smartphones – that enables a self-driving vehicle to build up a 3D representation of its surroundings.</p> <p><img alt="" src="/sites/default/files/aUTorontoinAnnArbor-crop.jpg" style="width: 750px; height: 500px;"></p> <p><em>Members of the aUToronto team with their winning perception cart at Mcity in Ann Arbor, Mich.&nbsp;(photo courtesy of&nbsp;aUToronto)</em></p> <p>In addition to being declared the overall winner, the U of T Engineering team received top marks in a wide range of categories, including the concept design event, the traffic light challenge and mobility innovation.</p> <p>One of the most dramatic parts of the competition was the dynamic object detection challenge, during which the cart had to detect and avoid a mannequin of a deer.</p> <p>"In the final testing session, the team realized their segmentation code was failing due to a change to the deer mannequin being used," says Waslander.</p> <p>"Having planned for just such an event, they immediately switched to high gear. They took and labelled over 2,000 images of the new deer, then spent the whole night training and tweaking a brand-new detector. It worked brilliantly in competition the next day, securing first place."</p> <p>Qian says he is very proud of all that the team has accomplished.</p> <p>"This year, with so many newer members, training became very important," he says. "We also lost some precious development time due to challenges associated with COVID-19.
I cannot give enough credit to the aUToronto flight team members who really took one for the team and pulled through together."</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Tue, 14 Jun 2022 19:49:13 +0000 Christopher.Sorensen 175243 at Researchers design 'socially aware' robots that can anticipate – and safely avoid – people on the move /news/researchers-design-socially-aware-robots-can-anticipate-and-safely-avoid-people-move <span class="field field--name-title field--type-string field--label-hidden">Researchers design 'socially aware' robots that can anticipate – and safely avoid – people on the move</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=92aueC8y 370w, /sites/default/files/styles/news_banner_740/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=POt2dsrM 740w, /sites/default/files/styles/news_banner_1110/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=weHgrGz7 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=92aueC8y" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-05-17T12:54:39-04:00" title="Tuesday, May 17, 2022 - 12:54" class="datetime">Tue, 05/17/2022 - 12:54</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long
field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">Hugues Thomas and his collaborators at the U of T Institute for Aerospace Studies created a new method for robot navigation based on self-supervised deep learning (photo by Safa Jinje)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/machine-learning" hreflang="en">machine learning</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>A team of researchers led by University of Toronto Professor&nbsp;<strong>Tim Barfoot&nbsp;</strong>is using a&nbsp;new strategy that allows robots to&nbsp;avoid colliding&nbsp;with people by predicting the future locations of dynamic obstacles in their path.&nbsp;</p> <p>The project, which is supported by&nbsp;Apple Machine Learning, will
be presented at the International Conference on Robotics and Automation in Philadelphia at the end of May.</p> <p>The results from a simulation, which are not yet peer-reviewed, <a href="https://arxiv.org/abs/2108.10585">are available on the arXiv preprint service</a>.</p> <p>“The principle of our work is to have a robot predict what people are going to do in the immediate future,” says <strong>Hugues Thomas</strong>, a post-doctoral researcher in Barfoot’s lab at the U of T Institute for Aerospace Studies in the Faculty of Applied Science &amp; Engineering. “This allows the robot to anticipate the movement of people it encounters rather than react once confronted with those obstacles.”</p> <p>To decide where to move, the robot makes use of Spatiotemporal Occupancy Grid Maps (SOGMs). These are 3D grid maps maintained in the robot’s processor, with each 2D grid cell containing predicted information about the activity in that space at a specific time. The robot chooses its future actions by processing these maps through existing trajectory-planning algorithms.</p> <p>Another key tool used by the team is light detection and ranging (lidar), a remote sensing technology similar to radar, except that it uses light instead of radio waves. Each ping of the lidar creates a point stored in the robot’s memory. Previous work by the team has focused on labelling these points based on their dynamic properties, which helps the robot recognize different types of objects within its surroundings.</p> <p>The team’s SOGM network currently recognizes four lidar point categories: the ground; permanent fixtures, such as walls; things that are movable but motionless, such as chairs and tables; and dynamic obstacles, such as people. No human labelling of the data is needed.</p> <p>“With this work, we hope to enable robots to navigate through crowded indoor spaces in a more socially aware manner,” says Barfoot.
“By predicting where people and other objects will go, we can plan paths that anticipate what dynamic elements will do.”</p> <p>In the paper, the team reports successful results from the algorithm carried out in simulation. The next challenge is to show similar performance in real-world settings, where human actions can be difficult to predict. As part of this effort, the team has tested their design on the first floor of U of T’s Myhal Centre for Engineering Innovation &amp; Entrepreneurship, where the robot was able to move past busy students.</p> <p>“When we do experiments in simulation, we have agents that are encoded with a certain behaviour, and they will go to a certain point by following the best trajectory to get there,” says Thomas. “But that’s not what people do in real life.”</p> <p>&nbsp;</p> <div class="media_embed" height="422px" width="750px"><iframe allow="autoplay" height="422px" src="https://drive.google.com/file/d/1wbq3lVdHZbU_4WSIz7-ArQN-g9fah-gL/preview" width="750px"></iframe></div> <p>&nbsp;</p> <p>When people move through spaces, they may hurry, stop abruptly to talk to someone, or turn in a completely different direction. To deal with this kind of behaviour, the network employs a machine learning technique known as self-supervised learning.</p> <p>Self-supervised learning contrasts with other machine learning techniques, such as reinforcement learning, where an algorithm learns to perform a task by maximizing a notion of reward in a trial-and-error manner. While this approach works well for some tasks, such as a computer learning to play a game like chess or Go, it is not ideal for this type of navigation.</p> <p>“With reinforcement learning, you create a black box that makes it difficult to understand the connection between the input (what the robot sees) and the output (what the robot does),” says Thomas.
“It would also require the robot to fail many times before it learns the right calls, and we didn’t want our robot to learn by crashing into people.”</p> <p>By contrast, self-supervised learning is simple and comprehensible, meaning that it’s easier to see how the robot is making its decisions. This approach is also point-centric rather than object-centric, which means the network stays closer to the raw sensor data and allows for multimodal predictions.</p> <p>“Many traditional methods detect people as individual objects and create trajectories for them. But since our model is point-centric, our algorithm does not treat people as individual objects; it recognizes areas where people should be. And if you have a larger group of people, the area gets bigger,” says Thomas.</p> <p>“This research offers a promising direction that could have positive implications in areas such as autonomous driving and robot delivery, where an environment is not entirely predictable.”</p> <p>In the future, the team wants to see if they can scale up their network to learn more subtle cues from dynamic elements in a scene.</p> <p>“This will take a lot more training data,” says Barfoot. “But it should be possible because we’ve set ourselves up to generate the data in a more automatic way: the robot can gather more data itself while navigating, train better predictive models when not in operation, and then use these the next time it navigates a space.”</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Tue, 17 May 2022 16:54:39 +0000 Christopher.Sorensen 174762 at
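For readers curious what a Spatiotemporal Occupancy Grid Map might look like in code, here is a minimal sketch: a stack of 2D occupancy grids, one per future time step, that a trajectory planner can query. The class and method names are illustrative assumptions for this article, not the team's actual implementation.

```python
import numpy as np

class SOGM:
    """A toy Spatiotemporal Occupancy Grid Map: grid[t, x, y] holds the
    predicted probability that cell (x, y) is occupied t steps from now."""

    def __init__(self, horizon_steps, width, height, cell_size=0.1):
        self.cell_size = cell_size  # metres of floor per grid cell
        self.grid = np.zeros((horizon_steps, width, height))

    def set_prediction(self, t, x, y, prob):
        """Record a predicted occupancy probability for cell (x, y) at step t."""
        self.grid[t, x, y] = prob

    def is_free(self, t, x, y, threshold=0.5):
        """A trajectory planner queries whether a cell is predicted free."""
        return self.grid[t, x, y] < threshold

# Example: a person is predicted to cross cell (5, 5) two steps from now,
# so a planner would route around that cell at that time only.
sogm = SOGM(horizon_steps=10, width=20, height=20)
sogm.set_prediction(2, 5, 5, 0.9)

print(sogm.is_free(2, 5, 5))  # prints False: occupied two steps ahead
print(sogm.is_free(0, 5, 5))  # prints True: free right now
```

The key property, as the article describes, is that the same cell can be free at one time step and blocked at another, so the planner anticipates movement instead of merely reacting to it.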