Before running the demo, make sure your board is properly set up by following the [[Metropolis Microservices/Getting Started/Get Started|Get Started]] guide.
}}
== Starting the Demo Application ==
{{message|See [https://developer.ridgerun.com/wiki/index.php/Metropolis_Microservices/People_Engagement_Demo/Getting_the_code Getting the Code] for instructions on how to get an evaluation version of the demo application}}
'''1. Configure the camera service:''' the configuration for the '''camera''' service is under '''config/camera''' in the demo folder. Read the [https://developer.ridgerun.com/wiki/index.php/Metropolis_Microservices/RidgeRun_Services/Camera Camera Service] documentation for configuration options.
'''2. Configure the media service:''' the configuration for the '''media''' service is under '''config/media''' in the demo folder. Read the [https://developer.ridgerun.com/wiki/index.php/Metropolis_Microservices/RidgeRun_Services/media Media Service] documentation for configuration options. Also, make sure the '''calibration.json''' file in the same directory is updated for your cameras.
'''3. Configure the engagement analytics service:''' the configuration for the '''engagement analytics''' service is under '''config/engagement-analytics''' in the demo folder. Read the [https://developer.ridgerun.com/wiki/index.php/Metropolis_Microservices/RidgeRun_Services/Engagement_Analytics Engagement Analytics Service] documentation for configuration options.
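The three configuration steps above can be quickly sanity-checked from a terminal on the board before starting the services. The sketch below assumes the demo folder layout described in these steps; file names inside each directory other than '''calibration.json''' may differ in your release.
<syntaxhighlight lang="bash">
# Run from the demo folder. List the configuration for each service:
ls config/camera config/media config/engagement-analytics

# Check that calibration.json is syntactically valid JSON before starting:
python3 -m json.tool config/media/calibration.json > /dev/null \
  && echo "calibration.json: valid JSON" \
  || echo "calibration.json: syntax error"
</syntaxhighlight>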
'''4. Configure the display service:''' for the '''display''' service, make sure the '''DISPLAY''' variable in the '''.env''' file located in the root of the demo has the proper value for your board. To double-check, you can run the following command from a terminal on your board:
<syntaxhighlight lang="bash">
echo $DISPLAY
</syntaxhighlight>
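The value reported by '''echo $DISPLAY''' is what belongs in the '''.env''' file. The sketch below uses a temporary file so it can be tried anywhere; on the board, edit the '''.env''' file in the demo root instead. The value ''':0''' is only a common example, not a guaranteed default.
<syntaxhighlight lang="bash">
# Hypothetical example: update DISPLAY in a .env-style file.
env_file=$(mktemp)
echo 'DISPLAY=:1' > "$env_file"                   # stale value
sed -i 's/^DISPLAY=.*/DISPLAY=:0/' "$env_file"    # use your board's value
cat "$env_file"                                   # DISPLAY=:0
</syntaxhighlight>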
'''5. Start the application:''' once everything is properly configured, run the following command from the demo folder to start the application:
<syntaxhighlight lang="bash">
docker compose up -d
</syntaxhighlight>
This will launch and configure all the application services.
{{message|
You can check that the demo is up by running '''docker ps'''. You should see the camera, media, analytics, display, Grafana, InfluxDB, and Redis services listed.
The first time you start the demo application, the '''media-service''' will take several minutes to initialize while the models are loaded.
}}
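To follow the note above in practice, you can watch the containers and the media-service initialization from a terminal on the board. The service names come from the note; the exact container and compose service names depend on your compose file, so treat '''media-service''' below as an assumption.
<syntaxhighlight lang="bash">
# List running containers with their status:
docker ps --format 'table {{.Names}}\t{{.Status}}'

# Follow the media-service logs while the models initialize (first run only):
docker compose logs -f media-service
</syntaxhighlight>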
== Demo Usage ==
If you followed the previous steps to start the application, you should have a stream available at '''rtsp://BOARD_IP:5021/ptz_out'''. You can open it from the host computer with the following command:
<syntaxhighlight lang="bash">
vlc rtsp://BOARD_IP:5021/ptz_out
</syntaxhighlight>
Just replace '''BOARD_IP''' with the actual IP address of the board running the demo.
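The command above can also be scripted. The snippet below builds the stream URL from the board address ('''192.168.1.100''' is a placeholder, not a real default) and shows '''ffplay''' as an alternative player, assuming FFmpeg is installed on the host.
<syntaxhighlight lang="bash">
BOARD_IP=192.168.1.100                       # replace with your board's IP
RTSP_URL="rtsp://${BOARD_IP}:5021/ptz_out"
echo "$RTSP_URL"

# Either player works; ffplay is handy when VLC is not available:
# vlc "$RTSP_URL"
# ffplay "$RTSP_URL"
</syntaxhighlight>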
The demo is controlled through the AI-Agent, which comes with a web interface that can be accessed at '''BOARD_IP:30080/agent'''.
The page looks as follows:
[[File:Ai agent page.png|800px|thumb|center]]
Through that interface, the application can be controlled using natural language. The two available commands are: move camera and find objects.
=== Move Camera ===
The available options are:
* '''move the camera X degrees left''': moves the camera the specified number of degrees to the left.
* '''move the camera X degrees right''': moves the camera the specified number of degrees to the right.
=== Find Objects ===
With this feature, you can instruct the application to look for any object in the input stream, and two actions will be performed:
'''1.''' The camera will point to that object once it is found.
'''2.''' A clip of that event will be recorded (disabled by default).
{{Colored box|background-content-color=#EDF1F7|background-title-color=#6586B9|title-color=#FFFFFF|title='''Note'''|icon=notice-icon-white.png
|style=overflow:hidden;
|content=
You can start by typing "Find a dog". If there is a dog in the scene, the camera will point to it.
}}
{{Colored box|background-content-color=#EDF1F7|background-title-color=#6586B9|title-color=#FFFFFF|title='''Note'''|icon=notice-icon-white.png
|style=overflow:hidden;
|content=
Both the camera movement and clip recording can be disabled via the analytics-service configuration or API. Take a look at [[Metropolis Microservices/RidgeRun Services/Analytics|Analytics Service]] for more information.
}}
=== Demo in action ===
The following video shows how to start and run the demo.
<center>
<embedvideo service="youtube">https://www.youtube.com/watch?v=Z-w41ZBTr50</embedvideo>
</center>