Customer Milestone Report 3 ‐ Project Deliverable
This is the third milestone report of Group 7 of bounswe2024. The group, formed for the CMPE352 Introduction to Software Engineering course, is building a software product called Artifact, which semantically searches for paintings exhibited in museums and exhibitions, and also lets users create their own paintings and interact with each other through their art. The contributing members of this project are:
- Abdulsamet Alan
- Deniz Bilge Akkoç
- Mert Cengiz
- Asım Dağ
- Oğuz Hekim
- Eren Pakelgil
- Mustafa Ocak
- Hanaa Zaqout
- Dağlar Eren Tekşen
Artifact is a social media platform that makes finding paintings and sharing artworks with others easier. Its main domain is paintings: the project not only uses Wikidata as its main data source but also gives users a platform to share their paintings with each other. The main objective is to connect everyone with every painting.
Currently, one can register with a username, e-mail, and password (the e-mail is not yet validated, and password strength is not checked). A registered user can log in with their username and password. Whether registered or not, anyone can search by title, creator, genre, and art movement; the results are related to what is typed in the search bar. A registered user can create a post from the search results; it is not yet possible to upload an image and create a post from it, or to create a post without an image. One can like or comment on a post, but bookmark collections cannot be opened yet. Users have a profile page, but there is no administrator page implementing administrator features. The performance of the system is not yet optimized as described in the requirements.
This is the login page. Users can log in with a username and password.
This is the feed, which shows the posts of followed people.
This is the page for creating posts. Users can create posts with a title, description, image URL, and label.
This is the user profile. Users can see their own posts.
Users can search for a painting.
Users can comment on a painting, and these comments are displayed under the post.
Users can like a post.
Our GitHub code repository can be accessed via this link.
We have completed the deliverables for Customer Milestone 3. The release tag for the application can be found at this link: Group7-Practice-App-Release-v0.2
We have deployed the backend and frontend applications to Cloud Run, a Google Cloud Platform service. The required commands can be found in the readme.md files in the frontend and backend folders. Cloud Build simply builds the images using the Dockerfile included in each folder.
The deployment URLs are:
Artifact Backend Application - https://artifactbackend-yslcfqdwna-oa.a.run.app/
Artifact Frontend Application - https://frontend-yslcfqdwna-oa.a.run.app/
We used the Google Cloud CLI to deploy our applications to Cloud Run. We couldn't create a CI/CD pipeline for autonomous build, test, and deployment of the application because of insufficient permissions.
The APK file for the mobile application can be downloaded via this link:
The OpenAPI 3 documentation is accessible here as a YAML file. You can view the documentation in a nicer UI by opening the file on this website.
The final version of our SRS can be accessed here.
- Class Diagrams: Class Diagrams
- Use-Case Diagrams: Use-Case Diagrams
- Sequence Diagrams: Sequence Diagrams
Meeting Number | Link | Description |
---|---|---|
Meeting 13 | Meeting Notes - 13 | The priorities are determined. Remaining work is distributed among the members. |
Our application contains three folders: artifact_backend, artifact_frontend and artifact_mobileApp.
To build and run the whole app locally, follow these steps in the project root folder:
Prerequisites
- Docker
- docker-compose
Build the Docker Images
docker-compose build --no-cache
Run the Container
docker-compose up
This will start the backend app at localhost:8080, the frontend app at localhost:3000, and the mobile app at localhost:8081. Before building and starting the app, don't forget to create a .env file by filling in the fields from .env.example accordingly.
- You can find the deployment steps for frontend app here
- You can find the deployment steps for backend app here
In addition to the users below, one can also sign up and use that information when logging in to the application.
- Username: Daglar
- Password: 12345
- Username: EmilyArtista
- Password: 12345
- Username: ArtfulSoul22
- Password: 12345
- Username: VibrantBrush
- Password: 12345
- Username: ArtisticJennifer
- Password: 12345
- Username: CreativeJohnDoe
- Password: 12345
- Username: InkSplashMike
- Password: 12345
The lack of time to add more features to our application was the main challenge for our group. After the second Customer Milestone, we had to parse the search results on the frontend and mobile sides, while also adding the post-creation feature with likes and comments and improving our semantic search feature with new endpoints. Luckily, we were able to distribute the work among the contributing members so that the remaining work shrank quickly. On the other hand, our lack of knowledge about unit testing was a challenge, as there was limited time to learn it.
We have learned that unit testing during the implementation phase, and before every pull request, is essential to verify the functionality of the application and makes it easier to track progress. In the future, we are going to make sure that every pull request is unit tested. We also learned that sticking to plans strictly is essential for continuous development, and we intend to follow what we have planned in upcoming projects.
- Improve Semantic Search Feature
- Creating a Feedpage
- Render Search Results In PostCard For Mobile App
1. Issue#106 access profile page via feed page
2. Issue#104 render search results in postcard for mobile app
3. Issue#75 improve semantic search feature #89
Personal Wiki Page is here
I helped with the SQL commands during the semantic search part and changed them according to our purpose.
This unit test covers the first semantic search we implemented: it checks the query result, then checks the semantic search returns for that query result.
import unittest
from unittest.mock import MagicMock, patch  # patch was missing from the original imports

from your_module_name import related_search, get_painting_sparql, get_movement_sparql, get_genre_sparql, generate_combinations, find_paintings


class TestArtQueries(unittest.TestCase):
    def test_generate_combinations(self):
        words = ["word", "test"]
        combinations = generate_combinations(words)
        expected_combinations = ["word test", "word Test", "Word test", "Word Test"]
        self.assertEqual(combinations, expected_combinations)

    def test_get_painting_sparql(self):
        # Mock the SPARQLWrapper class and its methods to isolate the unit test.
        # Replace `your_module_name` with the actual module name.
        with patch('your_module_name.SPARQLWrapper') as mock_sparql:
            mock_query = MagicMock()
            mock_query.queryAndConvert.return_value = {"results": {"bindings": [{"itemLabel": {"value": "Test Painting"}}]}}
            mock_sparql.return_value = mock_query
            results = get_painting_sparql("test")
            self.assertEqual(results[0]["itemLabel"]["value"], "Test Painting")

    # Add similar tests for get_movement_sparql, get_genre_sparql, and find_paintings

    def test_related_search(self):
        # Mock the functions get_painting_sparql, get_movement_sparql, and get_genre_sparql
        with patch('your_module_name.get_painting_sparql') as mock_painting, \
                patch('your_module_name.get_movement_sparql') as mock_movement, \
                patch('your_module_name.get_genre_sparql') as mock_genre:
            mock_painting.return_value = [{"creator": {"value": "Q123"}}]
            mock_movement.return_value = [{"creator": {"value": "Q456"}}]
            mock_genre.return_value = [{"creator": {"value": "Q789"}}]
            results = related_search("test")
            self.assertEqual(len(results), 3)  # Assuming each function returns one result for this test case


if __name__ == '__main__':
    unittest.main()
Description of Other Significant Work (Docker Configurations, AWS, Database, or Framework Adjustments)
I did not make any contribution in this part.
We couldn't place the mobile feed under the bottom navigation tab no matter how much we tried, so we added it to the upper part of the mobile screen and pinned it there. Every case combination of a query, such as "mona lisa" and "mOnA liSa", used to return the same result, so I implemented a search filter that only varies the first letter of each word to get faster results.
- Create Post Functionality for Web Application
- Add Like, Unlike and Comment on a Post Functionalities
- Migrate Cloud SQL Database and Deploy the Application
Personal Wiki Page is here
I have completed the frontend application by myself so didn't have a chance to work on the API integrations.
I didn't write any unit tests, in order to keep the deployments on schedule for the demo.
Description of Other Significant Work (Docker Configurations, AWS, Database, or Framework Adjustments)
I have deployed the Backend and Frontend to the Google Cloud Platform. Created the Cloud MySQL Database and connected it to our backend (Made updates on both backend and cloud). Improved backend response objects for almost every endpoint with Oğuz Hekim in order to make the integration with Frontend application easier. I have created a smooth and seamless data retrieval and query invalidation on the frontend. Any request sent to backend affects the frontend immediately. Login and Signup features are working without an error. The system notifications to the users are working in almost every backend request.
I also implemented post data retrieval, post creation and improved the context management of our Mobile Application with Hanaa Zaqout.
I faced a great challenge while trying to deploy the application to the cloud. Our backend and frontend were not ready to deploy, so I made sure they were; our cloud database was also behind the current local database. I connected to the cloud database via an SSH tunnel and ran the commands to migrate the database manually.
Also, creating the context and query client was hard because it requires a lot of boilerplate code to make everything work. I didn't sleep the night before the demo to make sure everything worked fine.
- Improve Semantic Search Feature
- Make API Calls From Mobile Application
- Access Profile Page via Feed Page
- Improve Semantic Search Feature
- Mobile with API Call and Dockerized
- Access Profile Page via Feed Page
Wiki documentation of Mert Cengiz is available here
Collaborating with Eren Pakelgil, we integrated the third-party Wikidata API, which is the basis of the whole search feature in Artifact. A user searches by providing a text, and this API searches Wikidata for the image URL, creator, art movement, and genre with the help of SPARQL, using Python as an endpoint for that query language. The input of this third-party API call is the text provided to the search, and the response is the set of search result cards showing paintings related to that text.
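To illustrate the idea, a Wikidata SPARQL query for paintings might be built like the sketch below. The function name and the exact query shape are assumptions, not the project's actual implementation; the Wikidata identifiers (Q3305213 for painting, P18/P170/P136/P135 for image/creator/genre/movement) are real.

```python
# Illustrative sketch only: building a Wikidata SPARQL query for paintings.
# The function name and query shape are assumptions, not the project's code.

def build_painting_query(keyword: str, limit: int = 10) -> str:
    """Return a SPARQL query that finds paintings whose English label
    contains the keyword, together with creator, genre, movement and image."""
    return f"""
SELECT ?item ?itemLabel ?image ?creatorLabel ?genreLabel ?movementLabel WHERE {{
  ?item wdt:P31 wd:Q3305213 .                 # instance of: painting
  ?item rdfs:label ?label .
  FILTER(LANG(?label) = "en")
  FILTER(CONTAINS(LCASE(?label), LCASE("{keyword}")))
  OPTIONAL {{ ?item wdt:P18 ?image . }}       # image
  OPTIONAL {{ ?item wdt:P170 ?creator . }}    # creator
  OPTIONAL {{ ?item wdt:P136 ?genre . }}      # genre
  OPTIONAL {{ ?item wdt:P135 ?movement . }}   # movement
  SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en". }}
}}
LIMIT {limit}
"""

query = build_painting_query("starry night")
# The resulting string can then be sent to the Wikidata SPARQL endpoint.
```

A query built this way can be executed through SPARQLWrapper or a plain HTTP request against https://query.wikidata.org/sparql.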
First, the test checks whether we can connect to our database from the backend. If there is a connection, the test succeeds; otherwise, it fails.
def test_database_connection(self):
    self.assertTrue(connection.connection is not None)
As the second test, all models are populated with dummy data. If any model in our database cannot create a database table with the given data, this test fails directly. In real use, this kind of data will occur, and we want to be sure that our tables work.
def setUp(self):
    self.image = Image.objects.create(url="https://upload.wikimedia.org/wikipedia/commons/e/ec/Mona_Lisa%2C_by_Leonardo_da_Vinci%2C_from_C2RMF_retouched.jpg")
    self.profile = Profile.objects.create(username="db_akkoc", bio=None, profile_picture=None, followers=None)
    self.post = Post.objects.create(profile=self.profile, title="The Starry Night", content="What a wonderful image!", image=self.image)
    self.collection = Collection.objects.create(name="Cubist Paintings", profile=self.profile, posts=self.post)
    self.like = Like.objects.create(post=self.post, profile=self.profile)
    self.comment = Comment.objects.create(post=self.post, profile=self.profile, content="I love Cubism!")
    self.label = Label.objects.create(name="Cubist Oil", type="system", material="oil paint", genre="Cubism", is_own_artwork=False)
Finally, we test whether the attributes are set correctly in the database tables by checking every attribute of each table one by one. If any value differs from what was initially set, the related test fails; otherwise, it succeeds.
def test_image(self):
    image = Image.objects.get(url="https://upload.wikimedia.org/wikipedia/commons/e/ec/Mona_Lisa%2C_by_Leonardo_da_Vinci%2C_from_C2RMF_retouched.jpg")
    self.assertEqual(image.url, "https://upload.wikimedia.org/wikipedia/commons/e/ec/Mona_Lisa%2C_by_Leonardo_da_Vinci%2C_from_C2RMF_retouched.jpg")

def test_profile(self):
    profile = Profile.objects.get(username="db_akkoc")
    self.assertEqual(profile.bio, None)

def test_post(self):
    post = Post.objects.get(profile=Profile.objects.get(username="db_akkoc"))
    self.assertEqual(post.title, "The Starry Night")
    self.assertEqual(post.content, "What a wonderful image!")
    self.assertEqual(post.image, Image.objects.get(url="https://upload.wikimedia.org/wikipedia/commons/e/ec/Mona_Lisa%2C_by_Leonardo_da_Vinci%2C_from_C2RMF_retouched.jpg"))

def test_collection(self):
    collection = Collection.objects.get(name="Cubist Paintings")
    self.assertEqual(collection.profile, Profile.objects.get(username="db_akkoc"))
    self.assertEqual(collection.posts, Post.objects.get(profile=Profile.objects.get(username="db_akkoc")))

def test_like(self):
    like = Like.objects.get(post=Post.objects.get(profile=Profile.objects.get(username="db_akkoc")))
    self.assertEqual(like.profile, Profile.objects.get(username="db_akkoc"))

def test_comment(self):
    comment = Comment.objects.get(post=Post.objects.get(profile=Profile.objects.get(username="db_akkoc")))
    self.assertEqual(comment.profile, Profile.objects.get(username="db_akkoc"))
    self.assertEqual(comment.content, "I love Cubism!")

def test_label(self):
    label = Label.objects.get(name="Cubist Oil")
    self.assertEqual(label.type, "system")
    self.assertEqual(label.material, "oil paint")
    self.assertEqual(label.genre, "Cubism")
    self.assertEqual(label.is_own_artwork, False)
Description of Other Significant Work (Docker Configurations, AWS, Database, or Framework Adjustments)
The Dockerfile of the mobile application sets the working directory in the container, copies the current directory contents into the container at /usr/src/app, installs the packages specified in package.json, exposes port 8081, and provides the start command. Similarly, the mobile application is containerised, and hence integrated with the other parts of the project, by adding the build directory, volumes, ports, environment, and .env file to the docker-compose.yml file. After these updates, the mobile application can request and receive information from the database and use real data instead of mocks while enjoying the benefits of Dockerisation.
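A Dockerfile along the lines described above might look like the following minimal sketch; the base image, paths, and start command are assumptions, not the project's exact file.

```dockerfile
# Sketch of the mobile app Dockerfile described above; the base image
# and start command are assumptions.
FROM node:18-alpine

# Set the working directory in the container
WORKDIR /usr/src/app

# Copy the current directory contents into the container at /usr/src/app
COPY . /usr/src/app

# Install any needed packages specified in package.json
RUN npm install

# Expose the port the bundler listens on
EXPOSE 8081

# Provide the start command
CMD ["npm", "start"]
```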
While collaborating with Eren Pakelgil on improving the semantic search feature, I enabled reaching the results with their labels instead of their QIDs by introducing the labeling service in SPARQL; however, handling properties starting with both lowercase and capital letters was a challenge for us. Eren suggested using REGEX in our semantic search; at first it seemed too slow, but we then improved it with asynchronous functions and by trying only the first letters in both cases.
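The "first letters in both cases" optimization can be sketched as follows; the helper name is an assumption, and this is an illustration of the idea rather than the project's code.

```python
from itertools import product

def first_letter_variants(text: str) -> list[str]:
    """Return the query variants obtained by trying only the first letter
    of each word in lowercase and uppercase. For an n-word query this
    produces 2**n variants instead of trying every case combination of
    every character."""
    words = text.split()
    # For each word, two candidates: first letter lowered or capitalized.
    choices = [
        sorted({w[0].lower() + w[1:], w[0].upper() + w[1:]})
        for w in words if w
    ]
    return [" ".join(combo) for combo in product(*choices)]

variants = first_letter_variants("mona lisa")
# yields "mona lisa", "mona Lisa", "Mona lisa", "Mona Lisa" in some order
```

Running each variant as an exact-match query is much cheaper than a fully case-insensitive REGEX scan over the label text.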
- Implementing new search method
- Improve search time delay using asynchronous functions
- Discussing About the Order of Priority Among Tasks
All the functionalities for the semantic search part are in this PR. For this PR, we had a meeting with Mustafa Ocak for around 6-7 hours straight. The commits, development, and every discussion about what could be better happened in this PR, and that's why I couldn't separate my commits as I intended. I'm aware that this is not the best practice, but after such hard work, we wanted to finish everything in time and make it work as soon as possible.
This is the link to my wikipage.
I implemented the endpoint of the search function with Mustafa. In the initial approach, we used SPARQL's API and implemented a direct approach, searching for all possible matchings, but the frontend was taking too long, so we had to make it faster. To do this, I implemented asynchronous functionality: the search function first returns the paintings asynchronously, and after that searches for the related results. This gives the frontend enough time to render the initial results, and further results keep coming after rendering. To implement these asynchronous functions, I used two Python libraries:
- asyncio, which helped me implement the await functionality.
- aiohttp, which helped me implement the ClientSession.
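The flow can be sketched with plain asyncio as below. The fetches are simulated with `asyncio.sleep` instead of real aiohttp/Wikidata calls, and all function names are assumptions; in the real code an `aiohttp.ClientSession` would be passed into each fetch.

```python
import asyncio

# Illustrative sketch of the asynchronous search flow described above.
# Fetches are simulated; in the real code each one would be an aiohttp request.

async def fetch_paintings(query: str) -> list[str]:
    await asyncio.sleep(0.01)            # stands in for the primary SPARQL request
    return [f"painting result for {query}"]

async def fetch_related(query: str, kind: str) -> list[str]:
    await asyncio.sleep(0.02)            # slower secondary queries
    return [f"{kind} result for {query}"]

async def search(query: str) -> list[str]:
    # Return the direct painting matches first, then run the related
    # (creator/genre/movement) queries concurrently with gather.
    results = await fetch_paintings(query)
    related = await asyncio.gather(
        fetch_related(query, "creator"),
        fetch_related(query, "genre"),
        fetch_related(query, "movement"),
    )
    for batch in related:
        results.extend(batch)
    return results

results = asyncio.run(search("mona lisa"))
```

Because the three related queries run concurrently, their total wall time is roughly that of the slowest one rather than the sum of all three.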
Due to the asynchronous nature of the functions I implemented and the time limits, I could not generate proper unit tests.
Description of Other Significant Work (Docker Configurations, AWS, Database, or Framework Adjustments)
In the backend part, the Dockerfile could not initialize the database as expected due to missing migrations. I investigated the cause and realised that Django keeps migration records in the generated database. So if we don't delete the migration records, the newly generated migrations keep the same IDs and cannot be applied. I changed the database records to apply the changes. There was also a problem in the new profile creation process, and I suggested some possible solutions.
Starting to use a new framework in a short amount of time was a challenge for most of the team, including me. Semantic search was also a totally new concept for me, and I had no prior experience, so implementing such functionality was a real challenge. Our initial solution took too long to find the relevant data. Performance optimization was fun to work on, but still challenging. All of this had to be completed in a short time, and for most of it I was on sick leave, which made it a tough period to work through.
- Creating post, like, comment, profile, image endpoints
- Linking some of the models and created additional endpoints based on the request from frontend and mobile team
- Unit testing profile, post and comment serializers
- All the endpoints I created are in this PR
- Unit tests for profile, post and comment serializers are in this PR
- Restructuring the backend project for new endpoints
This is the link to my wikipage.
I implemented endpoints of these models :
- Profile
- Post
- Comment
- Like
- Bookmark
- Label
- Image
The functionality of these endpoints is not limited to GET and POST methods for the mentioned models; there are also endpoints for following/unfollowing users, liking/unliking posts, etc. Based on the feedback from the frontend and mobile teams, I revised the endpoints and changed or detailed some response objects and models. The development process for these endpoints can be tracked in issue#63, issue#113, and PR#103. The endpoints can be examined comprehensively in the API Documentation.
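The like/unlike behavior behind those endpoints is essentially a toggle. As a plain-Python illustration of that idea (the real implementation uses Django models and views, so every name here is an assumption):

```python
# Plain-Python illustration of the like/unlike toggle idea; the real
# endpoints operate on Django Like objects, not an in-memory set.

def toggle_like(likes: set[tuple[int, int]], profile_id: int, post_id: int) -> bool:
    """Like the post if this profile has not liked it yet; otherwise
    remove the like. Returns True when the post ends up liked."""
    key = (profile_id, post_id)
    if key in likes:
        likes.remove(key)
        return False
    likes.add(key)
    return True

likes: set[tuple[int, int]] = set()
first = toggle_like(likes, profile_id=1, post_id=42)   # like
second = toggle_like(likes, profile_id=1, post_id=42)  # unlike
```

In the Django version, the same check would be a `Like.objects.filter(...)` lookup followed by a delete or a create.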
Because of time limitations, I could not finish all the unit tests for the endpoints I mentioned above. I wrote unit tests for the serializers of Profile, Post and Comment models. This can be tracked at issue#125 and PR#126. Here is a code snippet for testing comment serializer. First, we set up the test class by preparing test data.
def setUp(self):
    # Create test data
    self.user = User.objects.create(username='testuser')
    self.profile = Profile.objects.get(username=self.user)
    self.image = Image.objects.create(url='example.com/image.jpg')
    self.post = Post.objects.create(title='Test Post', content='Test Content', profile=self.profile, image=self.image)
    self.comment = Comment.objects.create(post=self.post, profile=self.profile, content='Test Comment')
    self.serializer = CommentRetrieveSerializer(instance=self.comment)
Now that the data is ready, we can write our tests. I check two things. First, the comment serializer must contain the expected attributes.
def test_contains_expected_attributes(self):
    data = self.serializer.data
    self.assertEqual(set(data.keys()), {'id', 'content', 'profile', 'created_at', 'updated_at', 'post'})
Then we check that the attribute values of the serializer match the Comment object created in the setUp function.
def test_contains_expected_data(self):
    data = self.serializer.data
    excluded_fields = ['created_at', 'updated_at']
    for key, value in data.items():
        if key not in excluded_fields:
            if key == 'profile':
                self.assertEqual(value['username'], self.profile.username.username)
            elif key == 'post':
                self.assertEqual(value, self.post.id)
            else:
                self.assertEqual(value, getattr(self.comment, key))
Description of Other Significant Work (Docker Configurations, AWS, Database, or Framework Adjustments)
I almost always used the undockerized version and my local database while developing the endpoints, for convenience. Other team members took care of dockerization, deployment, etc.
Learning a new framework in a short amount of time was definitely a challenge for me. Also, anticipating what kind of data client-side would want from the endpoints was challenging at first. But after clarifying and communicating, it became much easier.
- Improve search time delay using asynchronous functions
- Implementing new search method
- Creating Unit Test for Search Methods
The work is in this PR. For the search part, we had a long work session with Asım, around 6 consecutive hours. Near the end, I had a bug in the SPARQL query parameters and could not solve it; Asım debugged and solved it, and thanks to him the code worked, which is why Asım created the PR. Unfortunately, we could not divide the whole search part into smaller issues. I am certainly aware that this is not the best way to work collaboratively, but the time constraint led us to this mistake.
This is the link to my wikipage.
I implemented the endpoint of the search function with Asım. We started with a naive approach to the search part using the SPARQL API: we searched all possible outputs for the given keywords, but as expected it took too long. To make it faster, Asım suggested making the search asynchronous, because we had different queries for a single search. The search function first returns the paintings asynchronously, and after that searches for the related results. This saved the search a lot of time, around 60-70%.
Beside SPARQL, we used two Python libraries:
- asyncio for the async approach
- aiohttp for the client session
With Asım, we tried to write some unit tests that at least run the queries and responses correctly; however, the async nature of the code made it difficult, and the unit tests did not run correctly.
import pytest
from unittest.mock import AsyncMock, patch  # these imports were missing from the original

from your_module_name import get_paintings_by_creator


@pytest.mark.asyncio
async def test_get_paintings_by_creator():
    session = AsyncMock()
    creator_id = "Q123"
    creator_label = "Some Artist"
    mock_response = {
        "results": {
            "bindings": [
                {
                    "itemLabel": {"value": "Some Painting"},
                    "image": {"value": "http://example.com/somepainting.jpg"},
                    "creatorLabel": {"value": "Some Artist"},
                    "creator": {"value": "http://www.wikidata.org/entity/Q123"},
                    "genreLabel": {"value": "Some Genre"},
                    "materialLabel": {"value": "Some Material"}
                },
                {
                    "itemLabel": {"value": "Another Painting"},
                    "image": {"value": "http://example.com/anotherpainting.jpg"},
                    "creatorLabel": {"value": "Some Artist"},
                    "creator": {"value": "http://www.wikidata.org/entity/Q123"},
                    "genreLabel": {"value": "Another Genre"},
                    "materialLabel": {"value": "Another Material"}
                }
            ]
        }
    }
    mock_consolidated_response = mock_response['results']["bindings"]
    with patch("your_module_name.fetch_sparql", return_value=mock_response):
        with patch("your_module_name.consolidate_art_data", return_value=mock_consolidated_response):
            results = await get_paintings_by_creator(session, creator_id, creator_label)
            assert results is not None
            assert isinstance(results, list)
            assert all(isinstance(result, dict) for result in results)
            for result in results:
                assert "itemLabel" in result
                assert "image" in result
                assert "creatorLabel" in result
                assert "creator" in result
                assert "genreLabel" in result
                assert "materialLabel" in result


@pytest.mark.asyncio
async def test_get_paintings_by_creator_empty_result():
    session = AsyncMock()
    creator_id = "Q123"
    creator_label = "Some Artist"
    mock_response = {"results": {"bindings": []}}
    with patch("your_module_name.fetch_sparql", return_value=mock_response):
        with patch("your_module_name.consolidate_art_data", return_value=mock_response['results']["bindings"]):
            results = await get_paintings_by_creator(session, creator_id, creator_label)
            assert results == []
Description of Other Significant Work (Docker Configurations, AWS, Database, or Framework Adjustments)
Although I did not manage the Docker part, I helped my friends with the Docker configuration when Docker did not work properly and helped them debug. Other than that, I did not work on this part.
I started to learn backend development with this project and did not have enough experience for such a collaborative project; learning a new framework in an area I had never worked in also made it challenging. During development, the Wikidata SPARQL queries were hard because of the parameters. Making the search "semantic" made it harder due to the increased number of queries and the splitting and joining of results. Showing the search results on the frontend and standardizing them was also hard. After achieving these goals, we had another problem: the search was not fast enough for a user, so we redid all of that work with an async approach, and debugging this part was also hard for me because of the async functions. The last challenge was the approaching deadline; the time constraint made me a bit panicked, but I tried to handle it as well as I could.
- Improve Semantic Search Feature
- Render Search Results In PostCard For Mobile App
- Create SearchResult and PostView component For Frontend
- Issue#98 frontend post and search
- Issue#104 render search results in postcard for mobile app
- Issue#106 access profile page via feed page
This is the link to my wikipage.
Collaborating with Mert Cengiz, we integrated the third-party Wikidata API, which is the basis of the whole search feature in Artifact. We created Python API functions that fetch results from the Wikidata API based on the painting name, genre name, movement name, and creator name, along with a related-search function that operates in a semantic way. The input of this third-party API call is the text provided to the search, and the response is a set of JSON objects including item label, genre label, material label, creator label, and image fields, which are later rendered in the corresponding search result cards.
I created a unit test for the sign-up and login views of the backend, but unfortunately an error occurred, so we didn't include it in the released version of our repository. For this reason, I'm not including it here.
Description of Other Significant Work (Docker Configurations, AWS, Database, or Framework Adjustments)
Even though Mert Cengiz and I created the initial models for our ORM model and database, I wasn't involved with containerizing the database and connecting the ports in the docker compose. Other team members took care of dockerization and deployment.
While collaborating with Mert Cengiz on improving the semantic search feature, handling properties starting with both lowercase and capital letters was a challenge for us. At first, I implemented a REGEX to address this issue: it matches in a case-insensitive way, looking for occurrences where the entered keyword is enclosed by a non-alphabetic character ([^a-zA-Z]) on both ends. Even though it returned a wide range of results, it introduced a significant overhead to the search functionality, so with Deniz Bilge Akkoç we worked on converting it to a form where case insensitivity is applied only to the space-separated words of the query keyword. This was a faster approach, if not the fastest; the finalized version was implemented by Asım Dağ and Mustafa Ocak in an asynchronous manner, which completely solved the issue for us. While working on the mobile part, I came across a problem where the bottom tab navigator was overflowed in the browser display by the content above it, even though I applied a fixed display to the tab container and a scrollable view to the upper content. Expo did not open our mobile app for a while, so we couldn't see the accurate mobile display of the app, which slowed down its progress significantly.
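The initial REGEX approach described above can be sketched in Python as follows; the helper name is an assumption, and only the pattern shape (case-insensitive, keyword enclosed by non-alphabetic characters or string boundaries) comes from the text.

```python
import re

# Sketch of the REGEX approach described above: a case-insensitive match
# where the keyword must be bounded by a non-alphabetic character (or the
# string boundary) on both ends. The helper name is an assumption.

def keyword_pattern(keyword: str) -> re.Pattern:
    boundary = r"[^a-zA-Z]"
    return re.compile(
        rf"(^|{boundary}){re.escape(keyword)}({boundary}|$)",
        re.IGNORECASE,
    )

pattern = keyword_pattern("mona lisa")
```

For example, the pattern matches "The Mona Lisa, by Leonardo" but not "ramona lisa", where the keyword is glued to surrounding letters. Scanning long labels this way for every candidate is what made the approach slow.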
Wiki documentation of Dağlar Eren Tekşen is available here
Considering the endpoints that the backend people had completed, I began to search for material to use in profiles, posts, comments, etc. With the help of the Postman tool, I created 7 profiles with usernames and passwords. Then, I patched the remaining fields, which are the profile photo and bio. Then, I came up with two scenarios:
- The user Daglar sees an Edward Hopper painting shared by InkSplashMike, comments on the post, and learns more about the painter. He posts a bunch of Edward Hopper paintings.
- The user CreativeJohnDoe sees The Starry Night painting shared by ArtisticJennifer at MoMA in New York, comments, and learns about the museum. Then, he visits the museum and posts the paintings there.
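Patching a profile field like the bio can be done with a plain HTTP PATCH request. The sketch below builds such a request with the standard library; the endpoint URL and field names are hypothetical, since the real requests were made interactively through Postman.

```python
import json
import urllib.request

# Sketch of patching profile fields (bio, profile photo) as described above.
# The endpoint URL and field names are hypothetical assumptions.

def build_patch_request(base_url: str, username: str, fields: dict) -> urllib.request.Request:
    body = json.dumps(fields).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/profiles/{username}/",
        data=body,
        method="PATCH",
        headers={"Content-Type": "application/json"},
    )

req = build_patch_request(
    "http://localhost:8080/api",          # assumed base URL
    "Daglar",
    {"bio": "Edward Hopper enthusiast"},  # assumed field value
)
# urllib.request.urlopen(req) would send it to a running backend.
```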
There are no unit tests for the mock data and scenarios. The data is in the database and can be viewed.
Description of Other Significant Work (Docker Configurations, AWS, Database, or Framework Adjustments)
I always used the undockerized version and my local database while developing the mock data, for convenience. Other team members took care of dockerization, deployment, etc.
I had a difficult time finding paintings. Then, I thought of the scenarios first, which made it easier to find the paintings the users posted.
- Add Label, Comment, and PostCard Components. Add and Navigate to PostViewPage
- Render Search Results in PostCard for Mobile App
- Access Profile Page via Feed Page
Wiki documentation of Hanaa Zaqout is available here
I did not make any contribution in this part.
The test ensures that the frontend of the React Native mobile app correctly passes data to the PostViewPage component through the route params, verifying the data flow between mobile components. It sets up a mock route object containing sample post data, renders the PostViewPage component with this mock route object, and then asserts that the text elements representing the post details are present in the rendered component.
import React from 'react';
import { render } from '@testing-library/react-native';
import PostViewPage from '../PostViewPage';

describe('PostViewPage', () => {
  it('renders post details correctly', () => {
    const mockRoute = {
      params: {
        post: {
          imageURL: 'https://upload.wikimedia.org/wikipedia/commons/1/1c/Office-at-night-edward-hopper-1940.jpg',
          title: 'Office at Night',
          creator: 'Edward Hopper',
          material: 'Oil',
          genre: 'Portrait',
        },
      },
    };
    const { getByText } = render(<PostViewPage route={mockRoute} />);
    expect(getByText('Office at Night')).toBeTruthy();
    expect(getByText('Edward Hopper')).toBeTruthy();
    expect(getByText('Oil')).toBeTruthy();
    expect(getByText('Portrait')).toBeTruthy();
  });

  it('renders default post details when post object is not provided', () => {
    // post object not provided
    const { getByText } = render(<PostViewPage />);
    expect(getByText('')).toBeTruthy();
    expect(getByText('')).toBeTruthy();
    expect(getByText('')).toBeTruthy();
    expect(getByText('')).toBeTruthy();
    expect(getByText('')).toBeTruthy();
  });
});
Description of Other Significant Work (Docker Configurations, AWS, Database, or Framework Adjustments)
I focused mainly on frontend development for the mobile app. I used Docker to connect the frontend to the backend, but faced laptop freezing issues. Together with Samet, I also looked into integrating the backend endpoints for storing data from the AddPost interface into the database.
Eren noticed a bug where PostViewPage was showing up in unexpected places while navigating through the mobile application. I solved this bug by handling the navigation param correctly. I also spent some time debugging data flow in the application: when a PostCard is clicked, its data is expected to be passed to and shown on PostViewPage, but it was not. I solved the problem by using route params in PostViewPage. Moreover, we faced a bug in the mobile app where the navigation tab was not showing properly and was covered by other components on the page. I suggested placing the navigation tab at the top of the page (a last-minute solution). Our semantic search was slow, and we discussed how to improve it. I went over three possible approaches with Deniz Bilge: asynchronous methods, loops, and regex. Deniz and the rest of the team dedicated additional time to implementing these solutions based on our discussions.
Related to the submission of all the project deliverables for the project of the CMPE 352 course, during the 2024 Spring semester, reported in this report, I, Deniz Bilge Akkoç, declare that:
- I am a student in the Computer Engineering program at Boğaziçi University and am registered for the CMPE 352 course during the 2024 Spring semester.
- All the material that I am submitting related to my project (including but not limited to the project repository, the final project report, and supplementary documents) has been exclusively prepared by myself.
- I have prepared this material individually without the assistance of anyone else, with the exception of permitted peer assistance which I have explicitly disclosed in this report.

Related to the submission of all the project deliverables for the project of the CMPE 352 course, during the 2024 Spring semester, reported in this report, I, Abdulsamet Alan, declare that:
- I am a student in the Computer Engineering program at Boğaziçi University and am registered for the CMPE 352 course during the 2024 Spring semester.
- All the material that I am submitting related to my project (including but not limited to the project repository, the final project report, and supplementary documents) has been exclusively prepared by myself.
- I have prepared this material individually without the assistance of anyone else, with the exception of permitted peer assistance which I have explicitly disclosed in this report.

Related to the submission of all the project deliverables for the project of the CMPE 352 course, during the 2024 Spring semester, reported in this report, I, Mert Cengiz, declare that:
- I am a student in the Computer Engineering program at Boğaziçi University and am registered for the CMPE 352 course during the 2024 Spring semester.
- All the material that I am submitting related to my project (including but not limited to the project repository, the final project report, and supplementary documents) has been exclusively prepared by myself.
- I have prepared this material individually without the assistance of anyone else, with the exception of permitted peer assistance which I have explicitly disclosed in this report.

Related to the submission of all the project deliverables for the project of the CMPE 352 course, during the 2024 Spring semester, reported in this report, I, Oğuz Hekim, declare that:
- I am a student in the Computer Engineering program at Boğaziçi University and am registered for the CMPE 352 course during the 2024 Spring semester.
- All the material that I am submitting related to my project (including but not limited to the project repository, the final project report, and supplementary documents) has been exclusively prepared by myself.
- I have prepared this material individually without the assistance of anyone else, with the exception of permitted peer assistance which I have explicitly disclosed in this report.

Related to the submission of all the project deliverables for the project of the CMPE 352 course, during the 2024 Spring semester, reported in this report, I, Eren Pakelgil, declare that:
- I am a student in the Computer Engineering program at Boğaziçi University and am registered for the CMPE 352 course during the 2024 Spring semester.
- All the material that I am submitting related to my project (including but not limited to the project repository, the final project report, and supplementary documents) has been exclusively prepared by myself.
- I have prepared this material individually without the assistance of anyone else, with the exception of permitted peer assistance which I have explicitly disclosed in this report.

Related to the submission of all the project deliverables for the project of the CMPE 352 course, during the 2024 Spring semester, reported in this report, I, Dağlar Eren Tekşen, declare that:
- I am a student in the Computer Engineering program at Boğaziçi University and am registered for the CMPE 352 course during the 2024 Spring semester.
- All the material that I am submitting related to my project (including but not limited to the project repository, the final project report, and supplementary documents) has been exclusively prepared by myself.
- I have prepared this material individually without the assistance of anyone else, with the exception of permitted peer assistance which I have explicitly disclosed in this report.

Related to the submission of all the project deliverables for the project of the CMPE 352 course, during the 2024 Spring semester, reported in this report, I, Hanaa Zaqout, declare that:
- I am a student in the Computer Engineering program at Boğaziçi University and am registered for the CMPE 352 course during the 2024 Spring semester.
- All the material that I am submitting related to my project (including but not limited to the project repository, the final project report, and supplementary documents) has been exclusively prepared by myself.
- I have prepared this material individually without the assistance of anyone else, with the exception of permitted peer assistance which I have explicitly disclosed in this report.

Related to the submission of all the project deliverables for the project of the CMPE 352 course, during the 2024 Spring semester, reported in this report, I, Mustafa Ocak, declare that:
- I am a student in the Computer Engineering program at Boğaziçi University and am registered for the CMPE 352 course during the 2024 Spring semester.
- All the material that I am submitting related to my project (including but not limited to the project repository, the final project report, and supplementary documents) has been exclusively prepared by myself.
- I have prepared this material individually without the assistance of anyone else, with the exception of permitted peer assistance which I have explicitly disclosed in this report.