Open Position

Hi Guys,

Here is a great open position in Zürich, at ETH, to work with a research team in digital fabrication (building with robots).

The salary is very good, and the work consists of developing a robotics simulation platform. The new Robotic Fabrication Laboratory is involved. http://gramaziokohler.arch.ethz.ch/web/e/forschung/186.html

 

Here is the full position:

https://apply.refline.ch/845721/4317/pub/1/index.html

 

 

Please feel free to ask me anything or to share this position with your contacts.

Cheers

 

-Augusto

 

Tully Foote via ros-users | 3 Feb 22:59 2016

Upcoming suspension of Debian packaging for EOL Ubuntu distros

Hi Everyone, 

As some of you may be aware, we have always planned to drop support for EOL'd Ubuntu distros; however, we had not done so because our build farm could not disable the builds without breaking users who build from source.

As announced last week, we have rolled out our new build farm. One of its new features is that we can now disable specific target platforms for building new packages without removing all references to the target platform. This allows users to keep using the older distros if they want to continue building from source. And, as always, we will continue to keep the existing Debian packages available; the only difference is that we will no longer be releasing updates.

For Indigo and Jade, the Ubuntu distributions Saucy and Utopic have already reached their end of life, and Vivid's EOL is tomorrow (see https://wiki.ubuntu.com/Releases for EOL dates). As such, we plan to stop building for those platforms by the end of this month.

For maintainers: we are staging a sync at the moment. We expect one final sync in the middle of February before we turn off the builds for Saucy, Utopic, and Vivid. Please keep that in mind as you plan your upcoming releases.

In the future, when other Ubuntu versions reach EOL, we expect to stop building for them promptly after the EOL date, since at that point Ubuntu will no longer be providing updates either.

Tully

Announcement of ROS In-hand scanner

I'd like to announce my first ROS package, the ROS In-hand scanner. It is a 3D scanning application based on the PCL In-hand scanner for small objects by Martin Sälzle and the PCL developers.

Documentation can be found here:
http://pointclouds.org/documentation/tutorials/in_hand_scanner.php

Until now it was only possible to use OpenNI-based sensors; I implemented a standard ROS interface using the PointCloud2 message type.
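
For readers without a ROS installation, the essence of the PointCloud2 interface can be sketched in plain Python: the message carries its points as a packed binary buffer described by field metadata. The layout below (little-endian float32 x/y/z at offsets 0, 4, 8, with a 12-byte point step) follows the common XYZ convention; it is an illustrative stand-in, not code from the package.

```python
import struct

# Minimal stand-in for a sensor_msgs/PointCloud2 data buffer:
# XYZ points packed as little-endian float32 triples
# (field offsets 0, 4, 8; point_step = 12 bytes).
POINT_STEP = 12

def pack_points(points):
    """Serialize (x, y, z) tuples into a PointCloud2-style byte buffer."""
    return b"".join(struct.pack("<fff", *p) for p in points)

def unpack_points(data):
    """Recover (x, y, z) tuples from the packed buffer."""
    return [struct.unpack_from("<fff", data, i)
            for i in range(0, len(data), POINT_STEP)]

cloud = pack_points([(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)])
points = unpack_points(cloud)
```

A ROS node would wrap such a buffer in the PointCloud2 message's header and field descriptions; any sensor that can fill this layout can then feed the scanner.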

My first results using Intel RealSense can be seen in the following video:
https://drive.google.com/file/d/0B-EoR_OK9GCgcEw3V0l1ZnNPU2s/view

In the future it could be possible to create an RViz plugin, implement global registration, or generate watertight meshes for 3D printing applications.

For the moment, have fun scanning!
Patrick Wiesen
--
Patrick Wiesen B.Eng.
Fachbereich Maschinenbau und Mechatronik
Fachgebiet Automatisierungstechnik und Robotik
FH Aachen
University of Applied Sciences

Goethestr. 1 

52064 Aachen |  Germany


T +49.241.6009 52508
F +49.241.6009 52681

www.fh-aachen.de
wiesen-M0fD5G/vvvtn68oJJulU0Q@public.gmane.org
Ricky Li via ros-users | 30 Jan 02:29 2016

[Jobs]Robotics Engineers for mobile robot at Amy Robotics

Amy Robotics is an innovative company focused on service robots that enhance quality of life through robotic technologies, products, and services.


Our team is based in Hangzhou, China. We are developing autonomous service robots that assist people in everyday living and work. We need some help to improve our development process and get our robot shipped soon. We are looking for multiple experienced roboticists to work on mobile robot navigation and computer vision for our service robots.
  
Position 1:

As a robotics engineer, you will be involved in designing, implementing, and testing systems for mapping, planning, localisation, and context awareness. The ideal candidate will lead the research and development of our robot's navigation in its environments. We have many challenging problems and will give you the independence and flexibility to address them to create a complete product experience.

Qualifications: 

- Solid knowledge of mobile robot navigation theory. 
- Proficiency with C++, Python, and Linux 
- Hands on experience in ROS development in a Linux environment 
- Experience with robots and sensor systems in the real world 
- Experience in Android development is a big plus.   
- Ph.D., MSc (with 3 years' experience), or BSc (with 5 years' experience) in robotics or a related field

Position 2: 

As a Machine Vision Scientist, you will lead the research and development of our robot's image-based understanding of its environment. Specifically, you will be responsible for developing and testing tools and algorithms in areas such as:

Detection and recognition of people and objects
Place recognition 
Motion detection (of things in the environment) while stationary and while driving 
Feature tracking 
Image stabilization 
Object detection and tracking 
Visual odometry, etc.

Qualifications: 
- Expert knowledge of C++
- Experience working with OpenCV
- Experience applying machine learning to real-world vision problems
- At least 3 years of experience designing, implementing, and tuning computer vision algorithms
- M.S./Ph.D., or B.S. with 5 years' experience, in computer science or related fields

Desirable skills:
- Experience with deep learning algorithms or toolkits
- Experience with sensor fusion or multimodal perception
- Experience with embedded hardware development
- Experience with Python
- Experience with GPU computing
- Experience in Android development is a big plus.

If you are interested in creating a sophisticated robot, or in building a company, and have a strong desire to make a difference in the robot revolution, we would like to hear from you.
Please submit a resume, a letter of motivation, and (links to) any supporting materials (personal profile, open source contributions, projects, etc.) by email to Ricky Li: lirj [at] amyrobot.cn

Tully Foote via ros-users | 27 Jan 03:03 2016

Next Generation ROS Build Farm Deployed

Hi Everyone,

We’re happy to announce that the ROS build farm has been upgraded and is now running at http://build.ros.org 

This is a rewritten version of the ROS build farm, which is now documented at http://wiki.ros.org/buildfarm with the goal that companies or projects can run their own instance to leverage the power of the ROS build farm for their own projects. If you are interested in finding out more, please read through the documentation and join the SIG mailing list: https://groups.google.com/d/forum/ros-sig-buildfarm

Note that this design is focused on running a large build farm and requires several computers and some configuration time. However, if you are an individual developer, there are several things available from the new deployment that will be of value to you. First, we can now support GitHub pull request integration; many of you have seen integration like this via Travis, and we will be able to support it for indexed repositories. Second, we have methods to reproduce all the jobs on the build farm locally. This means that if you get a build failure on the farm, you can run the same code path on your local machine for much faster debugging. Each of the job types has instructions for reproducing it locally, linked from here: https://github.com/ros-infrastructure/ros_buildfarm/blob/master/doc/index.rst

We have also put in a lot of time to bring down the rate of false-positive emails. Please pay attention to emails from the build farm. If you get an email which is a false positive, please look for or open a ticket at https://github.com/ros-infrastructure/ros_buildfarm/issues
In some cases, devel jobs that passed before may fail now, since the new build farm is stricter. For example, it performs an install step which it didn't do before. Also, fewer packages are installed by default, so dependencies that are not explicitly declared might be missing and fail the build now.

The new build farm also allows finer-grained control of builds, and we ask that you please remove any builds that are repeatedly failing and that you do not plan to fix soon. For example, since rosdistro has been updated to REP 143 (http://www.ros.org/reps/rep-0143.html), it is now possible to list a repository as a source entry to make it available via `rosinstall_generator` without generating a devel job.

Note that some email deliveries are currently delayed. We're working on improving the timeliness of our email delivery, but right now our server is being throttled for delivery.

Also, for the next few sync cycles, our shadow-fixed uploads will be delayed as we move away from our previous host, where we are still staging our uploads through a manual process.

We'd like to extend a thank-you to everyone who has helped us test this, especially our friends at Bosch, who deployed our first version.

Tully

PS: Some fun facts about the build farm: for Indigo and Jade we have 25546 different jobs running on the farm to support all the different packages, repositories, and build types on all the different architectures. 24909 of those builds are passing, with 54 building successfully but failing one or more unit tests.


Driverless Development Vehicle with ROS Interface

Choose either the Lincoln MKZ or Ford Fusion as a development vehicle.



Full control of
  • throttle
  • brakes
  • steering
  • shifting
  • turn signals
Read production sensor data such as
  • gyros
  • accelerometers
  • GPS
  • wheel speeds
  • tire pressures

There are no visual indications that the production vehicle has been modified. All electronics and wiring are hidden.

For more information, contact info-jHSJLH6ihCxgQP+SxMQ5Sg@public.gmane.org

http://wiki.ros.org/Robots/ADAS_Development_Vehicle_Kit
http://dataspeedinc.com/docs/ADAS_Kit.pdf
https://www.youtube.com/watch?v=-jMt8Mg27q0

Re: ROS Web Services - on GitHub

Hi

I just released my code of ROS Web Services on GitHub.

The Turtlesim example can be tested easily, but the example from the video illustration on this link will require some configuration, as it relies on the rbx1 package from the ROS By Example book.
The README file gives some more hints on the steps.

Based on our experience, working with rosjava was tedious and unfortunately, the easiness of the Java language cannot be fully utilized considering the difficulty of setting up ROS applications with ROSJAVA.
Just creating messages in ROS and making them available to ROSJAVA was a tedious process.

This experience relates to the Hydro version; I am not sure whether Indigo improved the rosjava workflow.

To make better use of ROS Web Services, we plan to port them to Python, which ROS supports more naturally.
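For readers curious what such a Python layer might look like, here is a minimal, standard-library-only REST sketch. Everything in it is invented for illustration: the endpoint paths (`/robot/position`, `/robot/cmd_vel`) and the in-memory `robot_state` dict stand in for what would really be ROS topic subscriptions and publications (e.g. via rospy); this is not the ROSWebServices API.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for the ROS side: in a real port these values would be read
# from and published to ROS topics; a plain dict keeps the sketch
# self-contained.
robot_state = {"x": 0.0, "y": 0.0, "theta": 0.0}

class RobotRestHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robot/position":
            body = json.dumps(robot_state).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def do_POST(self):
        if self.path == "/robot/cmd_vel":
            length = int(self.headers["Content-Length"])
            cmd = json.loads(self.rfile.read(length))
            # A real implementation would publish a velocity command to
            # a ROS topic here instead of mutating a dict.
            robot_state["x"] += cmd.get("linear", 0.0)
            self.send_response(200)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet

def serve(port=0):
    """Start the REST layer on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), RobotRestHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A client can then POST JSON such as `{"linear": 1.5}` to `/robot/cmd_vel` and GET `/robot/position` to read back the updated state.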

I just wanted to share this feedback with the community; I am open to discussion.

Thanks
Anis

--------------------------------------------
Anis Koubaa, PhD., 
Senior Fellow of the HEA (SFHEA)
ACM Chapter Chair (Saudi Arabia) 
Associate Professor
Prince Sultan University, Saudi Arabia
http://www.dei.isep.ipp.pt/~akoubaa
http://wiki.coins-lab.org
--------------------------------------------


From: "Anis Koubaa (COINS) via ros-users" <ros-users@...>
Reply-To: Anis Koubaa <akoubaa@...>, User discussions <ros-users@...>
Date: Friday, December 18, 2015 at 8:42 PM
To: <ros-users@...>
Subject: [ros-users] ROS Web Services

Hello,

I have developed ROS Web Services to provide new abstractions over ROS using SOAP and REST Web services. The objective was to provide an additional software abstraction layer on top of ROS that allows seamless interaction with ROS, even for non-roboticists. ROS Web Services can be seen as an alternative to rosbridge and rosjs, which use the Web to interact with ROS.

The ROS Web Services layer allows any developer with no robotics background to build a Web service client (SOAP or REST) that monitors and controls a ROS-enabled robot through simple interfaces.

A paper presenting ROS Web Services has been published in the Journal of Software Engineering for Robotics (http://joser.unibg.it/index.php?journal=joser&page=article&op=view&path%5B%5D=97&path%5B%5D=30). In the paper, we present an object-oriented design of software meta-models for integrating both kinds of Web services into ROS, and we validate it through a real implementation on a service robot. The implementation was done with rosjava under the Hydro version.

A video illustration is available at https://www.youtube.com/watch?v=WvjY5XjAX7U, and a brief description at http://wiki.coins-lab.org/index.php?title=Mybot.

I have not yet released the code due to lack of time, but I should post it on GitHub soon. Any comments are welcome.
We are now working on extending ROS Web Services with new features, and we plan to apply them to ROS-enabled drones.

Thanks
Anis

--------------------------------------------
Anis Koubaa, PhD., 
Senior Fellow of the HEA (SFHEA)
ACM Chapter Chair (Saudi Arabia) 
Associate Professor
Prince Sultan University, Saudi Arabia
--------------------------------------------

_______________________________________________
ros-users mailing list
ros-users@...
http://lists.ros.org/mailman/listinfo/ros-users

Open-source release: REMODE: Probabilistic, Monocular Dense Reconstruction in Real Time

Dear colleagues,

We are happy to release an open source implementation of our approach 
for real-time, monocular, dense depth estimation, called "REMODE".

The code is available at: https://github.com/uzh-rpg/rpg_open_remode

It implements a "REgularized, probabilistic, MOnocular Depth 
Estimation", as described in the paper:

M. Pizzoli, C. Forster, D. Scaramuzza
REMODE: Probabilistic, monocular dense reconstruction in real time
IEEE International Conference on Robotics and Automation (ICRA), pp. 
2609-2616, 2014

The idea is to achieve real-time performance by combining Bayesian, 
per-pixel estimation with a fast regularization scheme that takes into 
account the measurement uncertainty to provide spatial regularity and 
mitigate the effect of noise.
Namely, a probabilistic depth measurement is carried out in real time 
for each pixel and the computed uncertainty is used to reject erroneous 
estimations and provide live feedback on the reconstruction progress.
The novelty of the regularization is that the estimated depth 
uncertainty from the per-pixel depth estimation is used to weight the 
smoothing.
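To make the weighting idea concrete, here is a toy, CPU-only NumPy sketch. It is not REMODE's actual regularizer (which is a GPU-based, TV-like scheme); the function name and the `alpha` parameter are invented for this example. It only illustrates how per-pixel variance can control the strength of smoothing: confident pixels keep their measured depth, while uncertain pixels are pulled toward their neighbours.

```python
import numpy as np

def uncertainty_weighted_smooth(depth, variance, alpha=1.0):
    """One step of uncertainty-weighted smoothing of a depth map.

    Pixels with high measurement variance are pulled toward the average
    of their 4-neighbours, while confident (low-variance) pixels keep
    their measured depth.
    """
    # 4-neighbour average, with edge padding so border pixels keep neighbours
    padded = np.pad(depth, 1, mode="edge")
    neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
             padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    # Convex combination: the smoothing weight grows with the variance.
    w = variance / (variance + alpha)
    return (1.0 - w) * depth + w * neigh
```

On a flat 2 m depth map with one noisy 10 m outlier of high variance, the outlier is pulled close to 2 m while the confident pixels are left untouched.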

Since it provides real-time, dense depth maps along with the 
corresponding confidence maps, REMODE is very suitable for robotic 
applications, such as environment interaction, motion planning, active 
vision and control, where both dense information and map uncertainty may 
be required.
More info here: http://rpg.ifi.uzh.ch/research_dense.html

The open source implementation requires a CUDA capable GPU and the 
NVIDIA CUDA Toolkit.
Instructions for building and running the code are available in the 
repository wiki.

Best regards,

Matia Pizzoli, Christian Forster, Davide Scaramuzza
Chen, Zaiyong via ros-users | 22 Jan 08:26 2016

[rosserial][endianness] Running rosserial_embeddedlinux on MIPS platform

Hi all,

Setup:
- rosserial 0.7.1
- Python 2.7.6
- MIPS platform running the rosserial_embeddedlinux HelloRos example
- ROS workstation running Indigo

On the workstation, running

rosrun rosserial_python serial_node.py tcp

caused:

Creation of publisher failed: unpack requires a string argument of length 4

Related issues:
https://github.com/ros-drivers/rosserial/issues/76
https://github.com/ros-drivers/rosserial/issues/109
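For context, that message is Python 2's struct.error for a too-short buffer: rosserial's wire format is little-endian, and the Python-side node deserializes fields with struct.unpack. One plausible failure mode on a big-endian MIPS board (assuming that is what the linked issues describe) is a size field serialized in native byte order, so the host computes a wrong message length and hands struct.unpack a short buffer. A minimal reproduction of the error itself:

```python
import struct

# rosserial's wire protocol is little-endian; rosserial_python
# deserializes fields with explicit '<' format codes, e.g. '<I' for a
# uint32.
(value,) = struct.unpack('<I', b'\x01\x00\x00\x00')
assert value == 1  # 4 little-endian bytes -> 1

# A buffer shorter than the format requires raises struct.error --
# reported as "unpack requires a string argument of length 4" on
# Python 2 (Python 3 words it "a buffer of 4 bytes").
try:
    struct.unpack('<I', b'\x01\x00')  # only 2 of the required 4 bytes
    raise AssertionError("expected struct.error")
except struct.error:
    pass
```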

 

Any comments are welcome.

Thanks,
Zaiyong Chen


Call for Chapters: Springer Book on Robot Operating System (Volume 2)

Hello,

I am happy to announce that the call for chapters for the Springer Book on Robot Operating System (ROS), Volume 2, is now open:
http://events.coins-lab.org/springer_ros_book_v2.html

The book will be published by Springer. 

We look forward to receiving your contributions to make this book successful and useful for the ROS community.

For Volume 1 (http://www.springer.com/gp/book/9783319260525), we accepted 27 chapters, ranging from beginner to advanced level, including tutorials, case studies, and research papers. Volume 1 is expected to be released by February 2016.
After negotiation with Springer, contributing authors received a discount of around 80% on hardcopies as an incentive, in addition to having their work published.

The call-for-chapters website (see above) presents in detail the scope of the book, the different categories of chapters, the topics of interest, and the submission procedure. There are also Book Chapter Editing Guidelines (http://events.coins-lab.org/springer/SpringerBookChapterGuidelines.htm) that authors need to comply with.

In this volume, we intend to place a special focus on unmanned aerial vehicles using ROS. We particularly seek papers that present the design of a new drone and its integration with ROS, simulation environments for unmanned aerial vehicles with ROS and SITL, ground-station-to-drone communication protocols (e.g., MAVLink, MAVROS), control of unmanned aerial vehicles, and best practices for working with drones.

In a nutshell: abstracts must be submitted by February 15, 2016, to register the chapters and to identify in advance any possible overlap between chapters. Full chapter submissions are due on April 20, 2016.
Submissions and the review process will be handled through EasyChair; a link will be provided soon.

Each chapter will be reviewed by at least three expert reviewers, at least one of whom will be a ROS user and/or developer.

Want to be a reviewer for some chapters?
We are looking for ROS community users to provide reviews and feedback on the proposals and chapters submitted for the book. If you are interested in participating in the review process, please consider filling in the reviewer interest form:
https://docs.google.com/forms/d/11jTIXw8FnKyTIWr_GxKtZ4p3QSaRi-3LiBFfzdx0Ulo/viewform

We look forward to receiving your contribution for a successful ROS reference!

Anis
--------------------------------------------
Anis Koubaa, PhD., 
Senior Fellow of the HEA (SFHEA)
ACM Chapter Chair (Saudi Arabia) 
Associate Professor
Prince Sultan University, Saudi Arabia
&
Research Associate
CISTER/INESC-TEC, ISEP, Portugal

http://www.dei.isep.ipp.pt/~akoubaa
http://wiki.coins-lab.org
--------------------------------------------


