Revamp for 2023 Deployment
- Began with code from the `2023PreDeployment` branch
- Split each component into a separate file
- Added thorough in-code documentation
- Added eslint and prettier checks for readability and consistency
- Implemented intentional splits between local and global state
amalnanavati committed Apr 14, 2023
1 parent 257c488 commit c8d8545
Showing 457 changed files with 70,678 additions and 92,801 deletions.
31 changes: 2 additions & 29 deletions README.md
@@ -1,30 +1,3 @@
# feeding_web_interface
# Feeding Web Interface

1. SSH into nano to start the camera node; make sure you export ROS_MASTER_URI to LOVELACE

```ssh [email protected]``` (the passcode is the normal lab passcode, with the last 4 characters changed to N@n0)

```export ROS_MASTER_URI=https://192.168.2.145:11311```

```./run_camera.sh```

```uselovelace```

* **Make sure all the following commands are executed in the `src/feeding_web_interface/frontend` folder**

2. Go to `ws/src/feeding_web_interface/frontend`

```./ngrok start --all``` to start the proxy server
3. Launch rosbridge_server to create a local websocket server with the following command:

```roslaunch rosbridge_server rosbridge_websocket.launch```

4. Enter the following command:

```python -m SimpleHTTPServer 8082```

5. Start the web video server to help stream the camera node:

```rosrun web_video_server web_video_server```

6. Go to the webpage http://ada_feeding.ngrok.io/ to see the demo web interface. If you followed the above steps carefully, you should now see the camera stream displayed on the right and the foodImage on the left.
This repository contains code for the feeding web app. The app itself is in `feedingwebapp` and contains its own README. The other folder in this repository (will) contain a ROS package that can be used to test the web app's integration with ROS.
23 changes: 23 additions & 0 deletions feedingwebapp/.eslintignore
@@ -0,0 +1,23 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

# dependencies
/node_modules
/.pnp
.pnp.js

# testing
/coverage

# production
/build

# misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local

npm-debug.log*
yarn-debug.log*
yarn-error.log*
30 changes: 30 additions & 0 deletions feedingwebapp/.eslintrc.js
@@ -0,0 +1,30 @@
module.exports = {
root: true,
parserOptions: {
ecmaVersion: 2020,
sourceType: 'module',
ecmaFeatures: {
jsx: true
}
},
settings: {
react: {
version: 'detect'
}
},
env: {
jest: true,
browser: true,
amd: true,
node: true
},
extends: [
'eslint:recommended',
'plugin:react/recommended',
'plugin:prettier/recommended' // Make this the last element so prettier config overrides other formatting rules
],
rules: {
'no-unused-vars': ['error', { vars: 'all', args: 'after-used', ignoreRestSiblings: false }],
'prettier/prettier': ['error', {}, { usePrettierrc: true }]
}
}
23 changes: 23 additions & 0 deletions feedingwebapp/.gitignore
@@ -0,0 +1,23 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

# dependencies
/node_modules
/.pnp
.pnp.js

# testing
/coverage

# production
/build

# misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local

npm-debug.log*
yarn-debug.log*
yarn-error.log*
20 changes: 20 additions & 0 deletions feedingwebapp/.prettierrc
@@ -0,0 +1,20 @@
{
"arrowParens": "always",
"bracketSpacing": true,
"embeddedLanguageFormatting": "auto",
"htmlWhitespaceSensitivity": "css",
"insertPragma": false,
"jsxBracketSameLine": false,
"jsxSingleQuote": true,
"proseWrap": "preserve",
"quoteProps": "as-needed",
"requirePragma": false,
"semi": false,
"singleQuote": true,
"trailingComma": "none",
"useTabs": false,
"vueIndentScriptAndStyle": false,
"printWidth": 140,
"tabWidth": 2,
"rangeStart": 0
}
89 changes: 89 additions & 0 deletions feedingwebapp/README.md
@@ -0,0 +1,89 @@
# Feeding Web Interface
> For Technical Documentation, please refer to [this file](https://github.com/personalrobotics/feeding_web_interface/blob/2023PreDeployment/feedingwebapp/TechDocumentation.md).
## Summary
This project aims to develop a web app for connecting to and controlling the ADA robot during robot-assisted feeding. The overall workflow (state machine) for this app can be seen below.

<!-- ![Web App State Machine](https://user-images.githubusercontent.com/8277986/191333326-c71a1765-475c-40f6-87da-a79b7c73e0ee.png) -->
![newWebAppWorkflow](https://user-images.githubusercontent.com/26337328/223597500-5e520b7a-eb2b-45ad-b9e8-91fec1bdeba4.jpg)

## Dependencies
- [`npm`](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm)
- [ROS](http://wiki.ros.org/noetic/Installation)

## Misc Notes
Q: Why are we using a global state to manage which page of the app the user is on, as opposed to React Router?
A: If we used React Router, users could navigate through the web app by changing the URL or clicking the "back" button in their browser. However, not all pages should be reachable from all other pages; e.g., the bite selection page should only be reachable if the robot arm is above the plate. Hence, we use a global state so we can internally control the user's flow through the app.
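
To make this concrete, below is a minimal, hypothetical sketch of such a global state store. It assumes a `zustand`-style store (which the `useStore` hook in the technical documentation suggests); the actual store, state names, and setter live in the app's source and may differ.

```
// Hypothetical sketch only: a zustand-style global store that owns the current
// app state, so pages change only through setMealState (never through the URL).
import { create } from 'zustand'

const useGlobalState = create((set) => ({
  // Illustrative state names; the real constants live in the app's Constants file
  mealState: 'PreMeal',
  setMealState: (newState) => set({ mealState: newState })
}))

export default useGlobalState
```

A page component would then read `useGlobalState((state) => state.mealState)` and render its content only when the state machine is actually in that page's state.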

TODO (amaln):
- Document debug mode.
- We use npm, not yarn, as a package installer for this repo
- Add guidelines for contributing

## Style Guide
- For writing your code, follow the [AirBnB React/JSX Style Guide](https://airbnb.io/javascript/react/). For anything not specified there (e.g., variable naming conventions), use the [AirBnB JavaScript Style Guide](https://airbnb.io/javascript/).
- The style guide was written before hooks (e.g., `useState`) were added to React. Since hooks can only be used within the component's function, ordering code within that function becomes important. See [this for an example of how to order calls to various hooks](https://dev.to/abrahamlawson/react-style-guide-24pp#comment-1f4fd); a minimal illustration follows this list.
- For documenting code, follow the [React Styleguidist guide](https://react-styleguidist.js.org/docs/documenting/).
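
As a hypothetical illustration of that ordering (this component is not from the repo; all names are made up), one reasonable order is: state hooks, then memoized callbacks, then effects, then the returned JSX.

```
import React, { useCallback, useEffect, useState } from 'react'

// Hypothetical example component; the names here are illustrative only.
const ExampleCounter = (props) => {
  // 1. State hooks first
  const [count, setCount] = useState(0)

  // 2. Memoized callbacks that the JSX or effects depend on
  const increment = useCallback(() => setCount((c) => c + 1), [])

  // 3. Effects after everything they reference has been defined
  useEffect(() => {
    console.log(`count for ${props.label} is now ${count}`)
  }, [props.label, count])

  // 4. Finally, the rendered JSX
  return <button onClick={increment}>{count}</button>
}

export default ExampleCounter
```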

## TODO
- Consider having a generic "Teleoperation" page that would be accessible from anywhere. The page would show the robot's live video and buttons for the user to teleoperate the robot. This is useful, e.g., if the user wants to nudge the robot closer to their face, or move the robot when it is above the plate so it doesn't obstruct their desired food. In fact, maybe this should take the place of the "Video" modal that the user can open at any time. If the robot is currently executing a plan, the teleop buttons would be greyed out (but the user can still see the video); otherwise, the user can see the video and teleoperate the robot.
- An additional axis of customization is to let the user specify the staging location, where one option should be "above the plate", i.e., don't stage the robot in front of their mouth until the user initiates the bite (with a method other than opening their mouth).

## Usage
### How to run the app locally?
- Clone the repo: `git clone git@github.com:personalrobotics/feeding_web_interface.git` using SSH, or `git clone https://github.com/personalrobotics/feeding_web_interface.git` using HTTPS
- See all branches with `git branch -a` and check that the `2022_revamp` branch shows up
- Then check out the `2022_revamp` branch: `git checkout 2022_revamp`
- Run `cd ./feedingwebapp`
- Perform `npm install` to install all the packages related to this project
- Then, run `npm start` to begin the application.
- Then, use a web browser to navigate to `localhost:3000` to see the application.

In the `Home.js` file, you can set `debug = true` and run the application in debug mode to experience the GUI of the app without needing it to connect to ROS. Otherwise, set `debug = false` and run it along with roscore and the ROS messages described below.

#### How to run ROS 'stuff' with the app in `debug = true` mode?
For this, you just need the web app running. To mimic the robot/webapp communication, there are buttons throughout the app that mimic the state changes described in the state machine picture above.

#### How to run ROS 'stuff' with the app in `debug = false` mode?
- `cd` into the ROS workspace
- Make sure to open at least four terminals, all inside the workspace.
- Then, run `source devel/setup.bash` in each of the open terminals (use tmux to make splitting terminals easier on Ubuntu)
- On the first terminal, run `roscore`
- On the second terminal, run `roslaunch rosbridge_server rosbridge_websocket.launch` to start the rosbridge websocket server. _This is what allows the webapp to connect directly to ROS (when you hit the "start feeding" button); a sketch of the webapp side follows this list._
- On the third terminal (designate this terminal for yourself to be the "ROBOT sending messages to app") - You can publish messages to `/from_robot` topic and communicate with the webapp to make changes to the states. An example message is `rostopic pub /from_robot std_msgs/String "<state>"`.
- On the fourth terminal (designate this terminal for yourself to be the "ROBOT receiving messages from app") - You can listen to messages on the `/from_web` topic. An example command is `rostopic echo /from_web`, which listens to messages published from the webapp.
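
For reference, here is a hypothetical sketch of the webapp side of this communication, assuming `roslibjs` connected to the rosbridge websocket on its default port 9090; the app's actual ROS-connection code may differ.

```
// Hypothetical sketch only; the repo's actual ROS-connection code may differ.
import ROSLIB from 'roslib'

// Connect to the rosbridge websocket started in the second terminal
const ros = new ROSLIB.Ros({ url: 'ws://localhost:9090' })

// Listen for state messages from the robot
const fromRobot = new ROSLIB.Topic({ ros: ros, name: '/from_robot', messageType: 'std_msgs/String' })
fromRobot.subscribe((message) => console.log('Robot state:', message.data))

// Publish a state message from the webapp to the robot
const fromWeb = new ROSLIB.Topic({ ros: ros, name: '/from_web', messageType: 'std_msgs/String' })
fromWeb.publish(new ROSLIB.Message({ data: '<state>' }))
```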

#### How to run camera 'stuff' with the app?
The following steps outline how to stream the video recorded on a particular topic in a [rosbag](http://wiki.ros.org/rosbag). _These steps might differ if you are connecting to the robot's camera directly (please check)._
- Start by downloading the rosbags that you wish to run and store them in your ROS workspace.
- Make sure to run `source devel/setup.bash`.
- Then make sure to download the [web video server](http://wiki.ros.org/web_video_server) files by performing the following steps:
- First, split the terminal and open two terminals, both inside the `camera_ws`. Make sure you perform `source devel/setup.bash` in each.
- In the first terminal, perform the following:
```
git clone https://github.com/sfalexrog/async_web_server_cpp.git
cd async_web_server_cpp
git checkout noetic-devel
```
- In the second terminal, perform the following:
```
git clone https://github.com/RobotWebTools/web_video_server.git
```
Then, perform `catkin build` from one of the terminals.

##### Running Rosbags
- First, navigate into your `catkin_ws` workspace and make sure you have your rosbag files. If a bag is compressed, decompress it first and keep the decompressed file; after decompressing, the compressed original is usually backed up as <something>.orig.bag while the decompressed bag keeps the original name.
- Next, run `rosbag play <name of the bag file>`.
- You should see the rosbag running its contents.
- While this is happening, we need to start the web video server to be able to see the images as video. Navigate into the `camera_ws` workspace and change directory into wherever you have stored `web_video_server`.
- Inside this folder, run `rosrun web_video_server web_video_server`.
- After this, go to a web browser, type in `localhost:8080`, and you will see a list of topics to choose from. Choose a topic to listen to and you should see the video playing. As you do that, notice the parameters in the web URL changing. Tune those parameters (see the example URL after this list, and the document linked below) to get the best quality video for the webapp.
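
For example, a stream URL might look like the following; the topic name is illustrative, and `type`, `quality`, `width`, and `height` are among the parameters web_video_server accepts.

```
http://localhost:8080/stream?topic=/camera/color/image_raw&type=mjpeg&quality=50&width=640&height=480
```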

For further information, refer to the [web_video_server](http://wiki.ros.org/web_video_server) documentation.

## What are some next steps?
- Enabling a method of selecting food from the live video feed that gets displayed in the video tab.
- What happens if Wifi goes out?
- What happens if the user accidentally refreshes?
- When the webapp first starts, it should get the status from the robot and update itself to match. Implementing this would eliminate syncing issues between the robot and the app. Further, the Settings page should get its defaults from the robot.
- Currently, the E-stop is not accessible when the video modal is open. Consider changing this.
44 changes: 44 additions & 0 deletions feedingwebapp/TechDocumentation.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,44 @@
# Technical Documentation

## Communication between Robot and Webapp
Currently, the codebase uses ROS topics: the `from_robot` topic for messages from the robot, and the `from_web` topic for messages from the webapp to the robot. This is not the best way of fostering this communication; it would be ideal to shift from ROS topics to ROS services.
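
For illustration, below is a hypothetical sketch of what a service-based call could look like from the webapp via `roslibjs`. The service name and type are made up; nothing like this exists in the codebase yet.

```
import ROSLIB from 'roslib'

const ros = new ROSLIB.Ros({ url: 'ws://localhost:9090' })

// Illustrative service name and type only
const setStateService = new ROSLIB.Service({
  ros: ros,
  name: '/set_feeding_state',
  serviceType: 'std_srvs/SetBool'
})

// Unlike a fire-and-forget topic publish, a service call returns an explicit response,
// so the webapp knows whether the robot actually accepted the state change.
setStateService.callService(new ROSLIB.ServiceRequest({ data: true }), (result) => {
  console.log('Robot acknowledged state change:', result.success)
})
```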

## States
On [this page](https://github.com/personalrobotics/feeding_web_interface/blob/2022_revamp/feedingwebapp/src/Pages/Constants.js), all of the states are outlined as constants. Each button click calls a function, which eventually calls the `changeState()` function. The `changeState` function takes a `String` parameter, which must be one of the constant values specified in the constants file. The states determine which pages are displayed. For instance, if the app is currently in the `Not_Eating` state, it displays the first page. Once the `start feeding` button is clicked, `changeState` transitions the app to the `moving_above_the_plate` state.
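
Below is a hypothetical sketch of this pattern. The real constants and `changeState` implementation live in `Constants.js` and `Home.js`, and the store/topic imports are made-up placeholders.

```
import ROSLIB from 'roslib'
import { useGlobalStore } from './hypotheticalStore' // placeholder for the app's zustand store
import { fromWebTopic } from './hypotheticalRos' // placeholder for a ROSLIB.Topic on /from_web

// States indexed by number, matching the constants.States[i] lookups in Home.js;
// the actual string values may differ.
export const States = ['Not_Eating', 'Moving_Above_The_Plate', 'Food_Item_Selection' /* ... */]

// changeState takes one of the string constants above, updates the global store,
// and tells the robot about the transition.
export function changeState(newState) {
  useGlobalStore.getState().setFeedingStatus(newState) // update the app's global state
  fromWebTopic.publish(new ROSLIB.Message({ data: newState })) // notify the robot
}
```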

There is essentially a large `if/else` block, which is constantly checking for state changes. As the state changes, a different element gets rendered. Below is a bit of code showing that happening:
```
else if (currentStateVal.feeding_status == constants.States[2]) {
  return (
    <div style={{ overflowX: 'hidden', overflowY: 'auto' }} className='outer'>
      <h1 className='text-center txt-huge' style={{ fontSize: '40px' }}>Food Item Selection</h1>
      {isConnected ? (
        <div style={{ display: 'block' }}><p className='connectedDiv' style={{ fontSize: '24px' }}>🔌 connected</p></div>
      ) : (
        <div style={{ display: 'block' }}><p className='notConnectedDiv' style={{ fontSize: '24px' }}>⛔ not connected</p></div>
      )}
      <div style={{ display: 'block' }}>
        <Button className='doneBut' style={{ fontSize: '24px', marginTop: '0px', marginRight: '10px', marginLeft: 'auto', display: 'block' }} onClick={() => changeState(constants.States[9])}>✅ Done Eating</Button>
      </div>
      <p className='transmessage' style={{ marginBottom: '0px' }}>Choose from one of the following food items.</p>
      <Row xs={3} s={2} md={3} lg={4} className='justify-content-center mx-auto my-2' style={{ paddingBottom: '35vh' }}>
        {food.map((value, i) => (
          <Button key={i} variant='primary' className='mx-1 mb-1' style={{ paddingLeft: '0px', paddingRight: '0px', marginLeft: '0px', marginRight: '0px', fontSize: '25px' }} value={value} size='lg' onClick={(e) => food_item_clicked(e)}>{value}</Button>
        ))}
      </Row>
      <Footer />
    </div>
  )
}
```

#### Note about app states vs. Robot states
App States:
```
const [message, setMessage] = useState("");
```
This, for instance, controls local state of the app. It is internal to the `Home.js` file and is not accessible outside that component. In this case, we use it to hold the messages we want the user to see.

Robot States:
```
const currentStateVal = useStore((state) => state.defaultState);
```
We use `useStore` to change states in the app based on changes in the robot's state. So, in the above piece of code, `currentStateVal.feeding_status` holds the robot's state as seen by the app. If that value equals a certain state constant, then we execute the code block associated with that state.
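
Putting the two together, here is a hypothetical sketch of how they sit side by side in a component like `Home.js`. Only the `useState`/`useStore` lines mirror the snippets above; everything else, including the store import, is illustrative.

```
import React, { useState } from 'react'
import { useStore } from './hypotheticalStore' // placeholder for the app's zustand store

const HomeSketch = () => {
  // App (local) state: only this component can read or change it
  const [message, setMessage] = useState('')

  // Robot (global) state: shared across the app and updated as the robot's state changes
  const currentStateVal = useStore((state) => state.defaultState)

  return (
    <div>
      <p>{message}</p>
      <p>Robot state as seen by the app: {currentStateVal.feeding_status}</p>
      <button onClick={() => setMessage('Waiting for the robot...')}>Notify user</button>
    </div>
  )
}

export default HomeSketch
```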