COpenNI2_RGBD360.h
/* +---------------------------------------------------------------------------+
   |                     Mobile Robot Programming Toolkit (MRPT)                |
   |                          http://www.mrpt.org/                              |
   |                                                                             |
   | Copyright (c) 2005-2016, Individual contributors, see AUTHORS file         |
   | See: http://www.mrpt.org/Authors - All rights reserved.                    |
   | Released under BSD License. See details in http://www.mrpt.org/License     |
   +---------------------------------------------------------------------------+ */
#ifndef mrpt_OPENNI2_RGBD360_H
#define mrpt_OPENNI2_RGBD360_H

#include <mrpt/hwdrivers/COpenNI2Generic.h>
#include <mrpt/hwdrivers/CGenericSensor.h>
#include <mrpt/obs/CObservationRGBD360.h>
#include <mrpt/utils/TEnumType.h>
#include <mrpt/gui/CDisplayWindow.h>

namespace mrpt
{
  namespace hwdrivers
  {
    /** A class for grabbing RGBD images from several OpenNI2 sensors. This is used to obtain larger fields of view using a radial configuration of the sensors.
     *  The same options (resolution, fps, etc.) are used for every sensor.
     *
     * <h2>Configuration and usage:</h2> <hr>
     * Data is returned as observations of type mrpt::obs::CObservationRGBD360.
     * See those classes for documentation on their fields.
     *
     * As with any other CGenericSensor class, the normal sequence of methods to be called is:
     *   - CGenericSensor::loadConfig() - Or calls to the individual setXXX() methods to configure the sensor parameters.
     *   - COpenNI2_RGBD360::initialize() - to start the communication with the sensor.
     *   - COpenNI2_RGBD360::getNextObservation() - to retrieve the data.
     *
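     * As an illustration, a minimal sketch of that sequence (the .ini file name "rgbd360.ini" and
     * the section name "RGBD360_SENSOR" are placeholders only; error handling is omitted):
     *
     * \code
     * #include <mrpt/hwdrivers/COpenNI2_RGBD360.h>
     * #include <mrpt/utils/CConfigFile.h>
     *
     * mrpt::hwdrivers::COpenNI2_RGBD360  sensor;
     * mrpt::utils::CConfigFile           cfg("rgbd360.ini");     // hypothetical config file
     * sensor.loadConfig(cfg, "RGBD360_SENSOR");                  // hypothetical section name
     * sensor.initialize();                                       // opens the devices; throws on critical error
     *
     * mrpt::obs::CObservationRGBD360 obs;
     * bool there_is_obs = false, hw_error = false;
     * while (!hw_error)
     * {
     *   sensor.getNextObservation(obs, there_is_obs, hw_error);  // request a new observation
     *   if (there_is_obs)
     *   {
     *     // ... process obs here ...
     *   }
     * }
     * \endcode
     *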
     * <h2>Calibration parameters</h2><hr>
     * The reference system for both the depth and RGB images provided by each individual OpenNI2 sensor
     * is that of its RGB camera.
     * The extrinsic parameters of each RGBD sensor are read from a configuration file. This calibration was obtained
     * using the method reported in [].
     *
     * <h2>Coordinate conventions</h2><hr>
     * The origin of coordinates is the focal point of the RGB camera of the first indexed sensor, with the axes oriented
     * as in the diagram shown in mrpt::obs::CObservation3DRangeScan. Notice in that picture that the RGB camera is
     * assumed to have axes as usual in computer vision, which differ from those of the depth camera.
     *
     * The X,Y,Z axes used to report the data from accelerometers coincide with those of the depth camera
     * (e.g. a camera standing on a table would have ACC_Z = -9.8 m/s^2).
     *
     * Notice however that, for consistency with stereo cameras, when loading the calibration parameters from
     * a configuration file, the left-to-right pose increment is expected as if both RGB & IR cameras had
     * their +Z axes pointing forward, +X to the right, +Y downwards (just as is standard for stereo cameras
     * and in the computer vision literature). In other words: the pose stored in this class uses a different
     * axes convention for the depth camera than a stereo camera would, so when a pose L2R is loaded from a calibration file
     * it is actually converted as:
     *
     *   L2R(this class convention) = CPose3D(0,0,0,-90deg,0deg,-90deg) (+) L2R(in the config file)
     *
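     * A sketch of that conversion using mrpt::poses::CPose3D composition (the variable names are
     * illustrative only; the CPose3D constructor takes angles in radians):
     *
     * \code
     * #include <mrpt/poses/CPose3D.h>
     * #include <cmath>
     *
     * using mrpt::poses::CPose3D;
     *
     * const double DEG = M_PI / 180.0;           // degrees-to-radians factor
     * CPose3D l2r_file;                          // left-to-right pose increment as read from the calibration file
     * const CPose3D axes_change(0, 0, 0, -90.0 * DEG, 0.0 * DEG, -90.0 * DEG);  // (x,y,z,yaw,pitch,roll), radians
     * const CPose3D l2r_this_class = axes_change + l2r_file;   // the "(+)" pose composition above
     * \endcode
     *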
     * <h2>Some general comments</h2><hr>
     *   - Depth is grabbed at 10-bit resolution, and a raw range reading N is converted to meters as: range(m) = 0.1236 * tan(N/2842.5 + 1.1863) (see the sketch after this list).
     *   - This sensor can also be used from within rawlog-grabber to grab datasets from a robot carrying additional sensors.
     *   - There is no built-in threading support, so if you use this class manually (not within rawlog-grabber),
     *     ideally you should create a dedicated thread and continuously request data from it (see mrpt::system::createThread).
     *   - The intensity channel defaults to the RGB images, but it can be changed with setVideoChannel() to read the IR camera images (useful for calibration).
     *   - There is built-in support for an optional preview of the data in a window, so you don't even need to worry about creating a window to show it.
     *   - This class relies on an embedded version of libfreenect (you do NOT need to install it on your system). Thanks guys for the great job!
     *
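     * A minimal sketch of that raw-to-metric depth conversion (the helper name is hypothetical;
     * it simply applies the formula given in the first item of the list above):
     *
     * \code
     * #include <cmath>
     * #include <stdint.h>
     *
     * inline double rawDepthToMeters(uint16_t raw_reading)              // 10-bit raw range value
     * {
     *   return 0.1236 * std::tan(raw_reading / 2842.5 + 1.1863);        // result in meters
     * }
     * \endcode
     *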
     * <h2>Converting to a 3D point cloud</h2><hr>
     * You can convert the 3D observation into a 3D point cloud with this piece of code:
     *
     * \code
     * mrpt::obs::CObservationRGBD360    obs3D;
     * mrpt::maps::CColouredPointsMap    pntsMap;
     * pntsMap.colorScheme.scheme = CColouredPointsMap::cmFromIntensityImage;
     * pntsMap.loadFromRangeScan(obs3D);
     * \endcode
     *
     * Then the point cloud mrpt::maps::CColouredPointsMap can be converted into an OpenGL object for
     * rendering with mrpt::maps::CMetricMap::getAs3DObject() or alternatively with:
     *
     * \code
     * mrpt::opengl::CPointCloudColouredPtr gl_points = mrpt::opengl::CPointCloudColoured::Create();
     * gl_points->loadFromPointsMap(&pntsMap);
     * \endcode
     *
     *
     * <h2>Platform-specific comments</h2><hr>
     * For more details, refer to the <a href="http://openkinect.org/wiki/Main_Page" >libfreenect</a> documentation:
     *   - Linux: You'll need root privileges to access the Kinect. Or, install <code>MRPT/scripts/51-kinect.rules</code> in <code>/etc/udev/rules.d/</code> to allow access to all users.
     *   - Windows:
     *     - Since MRPT 0.9.4 you only need to install <a href="http://sourceforge.net/projects/libusb-win32/files/libusb-win32-releases/" >libusb-win32</a>: download and extract the latest libusb-win32-bin-x.x.x.x.zip
     *     - To install the drivers, read this: http://openkinect.org/wiki/Getting_Started#Windows
     *   - MacOS: (write me!)
     *
     *
     * <h2>Format of parameters for loading from a .ini file</h2><hr>
     *
     * \code
     * PARAMETERS IN THE ".INI"-LIKE CONFIGURATION STRINGS:
     * -------------------------------------------------------
     *  [supplied_section_name]
     *   sensorLabel     = RGBD360            // A text description
     *   preview_window  = false              // Show a window with a preview of the grabbed data in real-time
     *
     *   device_number   = 0                  // Device index to open (0:first Kinect, 1:second Kinect,...)
     *
     *   grab_image      = true               // Grab the RGB image channel? (Default=true)
     *   grab_depth      = true               // Grab the depth channel? (Default=true)
     *   grab_3D_points  = true               // Grab the 3D point cloud? (Default=true) If disabled, points can be generated later on.
     *
     *   video_channel   = VIDEO_CHANNEL_RGB  // Optional. Can be: VIDEO_CHANNEL_RGB (default) or VIDEO_CHANNEL_IR
     *
     *   pose_x=0       // Camera position in the robot (meters)
     *   pose_y=0
     *   pose_z=0
     *   pose_yaw=0     // Angles in degrees
     *   pose_pitch=0
     *   pose_roll=0
     *
     *
     *   // Left/Depth camera
     *  [supplied_section_name_LEFT]
     *   rawlog-grabber-ignore = true   // Instructs rawlog-grabber to ignore this section (it is not a separate device!)
     *
     *   resolution = [640 488]
     *   cx         = 314.649173
     *   cy         = 240.160459
     *   fx         = 572.882768
     *   fy         = 542.739980
     *   dist       = [-4.747169e-03 -4.357976e-03 0.000000e+00 0.000000e+00 0.000000e+00]   // The order is: [K1 K2 T1 T2 K3]
     *
     *   // Right/RGB camera
     *  [supplied_section_name_RIGHT]
     *   rawlog-grabber-ignore = true   // Instructs rawlog-grabber to ignore this section (it is not a separate device!)
     *
     *   resolution = [640 480]
     *   cx         = 322.515987
     *   cy         = 259.055966
     *   fx         = 521.179233
     *   fy         = 493.033034
     *   dist       = [5.858325e-02 3.856792e-02 0.000000e+00 0.000000e+00 0.000000e+00]     // The order is: [K1 K2 T1 T2 K3]
     *
     *   // Relative pose of the right camera wrt to the left camera:
     *   // This assumes that both camera frames are such that +Z points
     *   // forwards, and +X and +Y to the right and downwards.
     *   // For the actual coordinates employed in 3D observations, see figure in mrpt::obs::CObservation3DRangeScan
     *  [supplied_section_name_LEFT2RIGHT_POSE]
     *   rawlog-grabber-ignore = true   // Instructs rawlog-grabber to ignore this section (it is not a separate device!)
     *
     *   pose_quaternion = [0.025575 -0.000609 -0.001462 0.999987 0.002038 0.004335 -0.001693]
     *
     * \endcode
     *
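     * As a sketch, this is how the pose_* entries of the main section above map to a
     * mrpt::poses::CPose3D (file and section names are placeholders; note the angles are
     * stored in degrees in the file, while CPose3D expects radians):
     *
     * \code
     * #include <mrpt/utils/CConfigFile.h>
     * #include <mrpt/poses/CPose3D.h>
     * #include <cmath>
     *
     * mrpt::utils::CConfigFile cfg("rgbd360.ini");     // hypothetical config file
     * const std::string s = "RGBD360_SENSOR";          // hypothetical section name
     * const double DEG = M_PI / 180.0;
     * const mrpt::poses::CPose3D sensorPoseOnRobot(
     *   cfg.read_double(s, "pose_x", 0), cfg.read_double(s, "pose_y", 0), cfg.read_double(s, "pose_z", 0),
     *   DEG * cfg.read_double(s, "pose_yaw", 0),
     *   DEG * cfg.read_double(s, "pose_pitch", 0),
     *   DEG * cfg.read_double(s, "pose_roll", 0) );
     * \endcode
     *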
     * More references to read:
     *   - http://RGBD360
     *   - http://www.openni.org/
     * \ingroup mrpt_hwdrivers_grp
     */
    class HWDRIVERS_IMPEXP COpenNI2_RGBD360 : public mrpt::hwdrivers::CGenericSensor, public mrpt::hwdrivers::COpenNI2Generic
    {
      DEFINE_GENERIC_SENSOR(COpenNI2_RGBD360)

    public:

      COpenNI2_RGBD360();   //!< Default ctor
      ~COpenNI2_RGBD360();  //!< Destructor

      /** Initializes the 3D camera - should be invoked after calling loadConfig() or setting the different parameters with the set*() methods.
       *  \exception This method must throw an exception with a descriptive message if some critical error is found.
       */
      virtual void initialize();

      /** To be called at a high rate (>XX Hz), this method populates the internal buffer of received observations.
       *  This method is mainly intended for usage within rawlog-grabber or similar programs.
       *  For an alternative, see getNextObservation()
       *  \exception This method must throw an exception with a descriptive message if some critical error is found.
       *  \sa getNextObservation
       */
      virtual void doProcess();

      /** The main data retrieving function, to be called after calling loadConfig() and initialize().
       *  \param out_obs The output retrieved observation (only if there_is_obs=true).
       *  \param there_is_obs If set to false, there was no new observation.
       *  \param hardware_error True on hardware/comms error.
       *
       *  \sa doProcess
       */
      void getNextObservation(
        mrpt::obs::CObservationRGBD360 &out_obs,
        bool                           &there_is_obs,
        bool                           &hardware_error );

      /** Set the path where off-rawlog image files will be saved (this class DOES take into account this path).
       *  An empty string (the default value at construction) means to save images embedded in the rawlog, instead of as separate files.
       *  \exception std::exception If the directory doesn't exist and cannot be created.
       */
      virtual void setPathForExternalImages( const std::string &directory );


      /** @name Sensor parameters (alternative to \a loadConfig) and manual control
          @{ */

      /** Get the maximum range (meters) that can be read in the observation field "rangeImage" */
      inline double getMaxRange() const { return m_maxRange; }

      /** Enable/disable the grabbing of the RGB channel */
      inline void enableGrabRGB(bool enable=true) { m_grab_rgb=enable; }
      inline bool isGrabRGBEnabled() const { return m_grab_rgb; }

      /** Enable/disable the grabbing of the depth channel */
      inline void enableGrabDepth(bool enable=true) { m_grab_depth=enable; }
      inline bool isGrabDepthEnabled() const { return m_grab_depth; }

      /** Enable/disable the grabbing of the 3D point clouds */
      inline void enableGrab3DPoints(bool enable=true) { m_grab_3D_points=enable; }
      inline bool isGrab3DPointsEnabled() const { return m_grab_3D_points; }

      /** @} */

    protected:

      virtual void loadConfig_sensorSpecific(
        const mrpt::utils::CConfigFileBase &configSource,
        const std::string                  &section );

      mrpt::poses::CPose3D m_sensorPoseOnRobot;

      static const int NUM_SENSORS = 2;

      bool   m_preview_window;            //!< Show preview window while grabbing
      size_t m_preview_window_decimation; //!< If preview is enabled, only show 1 out of N images.
      size_t m_preview_decim_counter_range, m_preview_decim_counter_rgb;
      mrpt::gui::CDisplayWindowPtr m_win_range[NUM_SENSORS], m_win_int[NUM_SENSORS];

      double m_maxRange;  //!< Sensor max range (meters)

      bool m_grab_rgb, m_grab_depth, m_grab_3D_points;  //!< Default: all true

    }; // End of class

  } // End of namespace hwdrivers

} // End of namespace mrpt


#endif