OPTI-6G: Neural Network-Powered Position Detection via Optical Beam Steering
This demonstration showcases how the position of the user's equipment is estimated. First, the robot moves the user's equipment to a random location.
This position is displayed as a green dot in the figure on the right, which provides a top-down view of the system. Next, the LED transmitter begins sweeping its orientation, driven by a motorised mechanism that emulates a beam-steering system. You can observe this in the upper-left part of the video. As the LED reaches each predefined orientation, the received power at the photodiode is measured with a digital multimeter and recorded by the program.
Once the measurements from all orientations have been collected, they are fed as input to a pre-trained deep neural network, which estimates the equipment's position. The estimated position is marked in red, with a line illustrating the deviation from the actual position.
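The estimation step described above can be sketched as follows. This is an illustrative mock-up, not the model used in the demo: the number of orientations, the layer sizes, and the random placeholder weights are all assumptions, standing in for the actual trained network.

```python
import numpy as np

# Hypothetical sketch: the received-power readings gathered at N predefined
# LED orientations form the input vector of a small feed-forward network
# that outputs an (x, y) position estimate. Weights here are random
# placeholders; in the demo they come from a pre-trained model.
N_ORIENTATIONS = 16
HIDDEN = 32

rng = np.random.default_rng(0)
W1 = rng.normal(size=(N_ORIENTATIONS, HIDDEN))  # placeholder weights
b1 = np.zeros(HIDDEN)
W2 = rng.normal(size=(HIDDEN, 2))               # output: (x, y) position
b2 = np.zeros(2)

def estimate_position(powers):
    """Map a vector of photodiode power readings to an (x, y) estimate."""
    h = np.maximum(powers @ W1 + b1, 0.0)        # ReLU hidden layer
    return h @ W2 + b2

powers = rng.uniform(0.0, 1.0, size=N_ORIENTATIONS)  # mock multimeter readings
x_hat, y_hat = estimate_position(powers)
```

The key point is only the data flow: one power reading per beam orientation goes in, a 2-D position comes out.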
Finally, as a demonstration of a potential real-world application, the LED transmitter adjusts its orientation to point directly at the receiver based on the estimated position.
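The final pointing step can be sketched with basic geometry. The function name and coordinate convention are assumptions for illustration; the idea is simply that, in the top-down view, the pan angle toward the estimated receiver follows from the position difference.

```python
import math

# Illustrative sketch (names assumed): given the known LED transmitter
# position and the estimated receiver position in the top-down plane,
# compute the azimuth the LED should turn to in order to point at it.
def pointing_angle(led_xy, receiver_xy):
    """Azimuth (radians) from the LED toward the estimated receiver."""
    dx = receiver_xy[0] - led_xy[0]
    dy = receiver_xy[1] - led_xy[1]
    return math.atan2(dy, dx)

# A receiver estimated at (1, 1) relative to an LED at the origin
# lies at 45 degrees (pi/4 radians).
angle = pointing_angle((0.0, 0.0), (1.0, 1.0))
```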
Starring: Kevin Jose Acuna Condori, PhD student at Université Paris-Saclay & OPTI-6G researcher