Aggregate [Back-End]: This is the full Golang backend code for the AGGREGATE Project 🚀🌐
You are currently in the back-end section. To view the front-end, go here.
Aggregate is a content aggregation platform designed to streamline information consumption. Its purpose is to centralize feeds from various sources—such as RSS and Atom—into a unified stream. Users can effortlessly follow their favorite content, whether it’s news, blogs, or other updates. The project emphasizes efficiency, security, and a user-friendly experience, making it a valuable tool for staying informed in today’s fast-paced digital landscape. 🚀🌐
This will be a high-level list of the features this API supports. For a more detailed and in-depth list of features, you can visit the frontend here.
Some of the features include:
Administration/Admin endpoints:
More capabilities are in the pipeline, including feed and post management as well as moderation.
Permissions:
{comment:write} and {comment:read}. If a moderator bans a user, their comment:write permission may be removed, and thus the user's replies and comments will not be reflected.
Scraper:
Sanitization:
Panic, Shutdown, and Recovery:
CORS Management:
Metrics:
Mailer Support:
smtp settings as shown in the flags section.
Rate Limiter:
Configurable via the flags section.
Custom Login and Authentication:
Payment Client Support:
Customization & Flexibility:
Front-End Support and Flexibility:
In The Works:
NB: For a full list of all the capabilities, expressed better in the frontend, please VISIT the Aggregate frontend linked above this page.
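As a rough sketch of how permission codes such as comment:write and comment:read might be checked, consider the Go snippet below. The Permissions type and Include method are illustrative assumptions, not the project's actual implementation.

```go
package main

import "fmt"

// Permissions holds the permission codes granted to a user,
// e.g. "comment:write" and "comment:read".
type Permissions []string

// Include reports whether the given code is in the list.
func (p Permissions) Include(code string) bool {
	for _, c := range p {
		if c == code {
			return true
		}
	}
	return false
}

func main() {
	// A banned user whose comment:write permission was revoked by a moderator.
	user := Permissions{"comment:read"}
	fmt.Println(user.Include("comment:write")) // false: their comments will not be reflected
	fmt.Println(user.Include("comment:read"))  // true: they can still read comments
}
```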
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.
What you need to install the software and how to install them.
Before you can run or contribute to this project, you'll need to have the following software installed:
Clone the repository: Start by cloning the repository to your local machine. Open a terminal, navigate to the directory where you want to clone the repository, and run the following command:
git clone https://github.com/blue-davinci/aggregate.git
Navigate to the project directory: Use the cd
command to navigate to the project directory:
cd aggregate
Install the Go dependencies: The Go tools will automatically download and install the dependencies listed in the go.mod
file when you build or run the project. To download the dependencies without building or running the project, you can use the go mod download
command:
go mod download
Set up the database: The project uses a PostgreSQL database. You'll need to create a new database and update the connection string in your configuration file or environment variables.
We use GOOSE for all the data migrations and SQLC as the abstraction layer for the DB. To proceed with the migration, navigate to the schema directory:
cd internal\sql\schema
Then run goose {connection string} up to execute an Up migration, as shown:
goose postgres postgres://aggregate:password@localhost/aggregate up
Build the project: You can build the project using the makefile's command:
make build/api
This will create an executable file in the current directory. Note: The generated executable targets the Windows environment; however, you can find the Linux build command within the makefile!
Run the project: You can run the project using go run ./cmd/api, or use the Makefile and do:
make run/api
Makefile Help: For additional supported commands, run make help:
make help
Output:
make help
Usage:
run/api - run the api application
db/psql - connect to the db using psql
build/api - build the cmd/api application
audit - tidy dependencies and format, vet and test all code
db/migrations/up - run the up migrations using confirm as prerequisite
vendor - tidy and vendor dependencies
The application accepts command-line flags for configuration, establishes a connection pool to a database, and publishes variables for monitoring the application. The published variables include the application version, the number of active goroutines, and the current Unix timestamp.
The server runs on http://localhost:4000 by default. You can view the parameters by utilizing the -help command. Here is a rundown of the available commands for a quick lookup.
HTML and all its attributes can be stripped by the sanitizer; the default is false for a medium balance.
Using make run will run the API with a default connection string located in cmd\api\.env. If you're using PowerShell, you need to load the values, otherwise you will get a cannot load env file error. Use the PowerShell code below to load it or change the env variable:
$env:AGGREGATE_DB_DSN=(Get-Content -Path .\cmd\api\.env | Where-Object { $_ -match "AGGREGATE_DB_DSN" } | ForEach-Object { $($_.Split("=", 2)[1]) })
Alternatively, on Unix systems you can create a .envrc file and load it directly in the makefile by including it like so:
include .envrc
A successful run will output something like this:
make run/api
'Running cmd/api...'
go run ./cmd/api
{"level":"INFO","time":"2024-08-26T16:10:34Z","message":"Loading Environment Variables","properties":{"DSN":"cmd\\api\\.env"}}
{"level":"INFO","time":"2024-08-26T16:10:34Z","message":"database connection pool established"}
{"level":"INFO","time":"2024-00-26T16:00:34Z","message":"Starting RSS Feed Scraper","properties":{"Client Timeout":"15","Interval":"40s","No of Go Routines":"5","No of Retries":"3"}}
{"level":"INFO","time":"2024-00-26T16:00:34Z","message":"Starting autosubscription jobs...","properties":{"Interval":"720"}}
Below are most of the accepted flags in the API and a high-level description of what they do. To view the comprehensive list, please run the application with the -help flag:
GET /v1/healthcheck: Checks the health of the application. Returns a 200 OK status code if the application is running correctly.
POST /v1/users: Registers a new user.
PUT /v1/users/activated: Activates a user.
POST /v1/api/authentication: Creates an authentication token.
GET /debug/vars: Provides debug variables from the expvar
package.
POST /feeds: Add an RSS Type feed {Atom/RSS}
GET /feeds?page=1&page_size=30: Get all feeds in the DB, with pagination. Note: You can leave the pagination parameters at their default values!
POST /feeds/follows: Follow any feed for a user
GET /feeds/follow: Get all feeds followed by a user. Supports pagination and search.
DELETE /feeds/follow/{feed_id}: Unfollow a followed feed
GET /feeds: Get all Posts from scraped feeds that are followed by a user. Supports pagination and search.
POST /password-reset: Initial request for password reset that sends a validation token
PUT /password: Updates actual password after reset.
GET /notifications: Retrieve notifications on a per-user basis. The current implementation supports polling and on-demand retrieval.
GET /feeds/favorites: Get favorite feeds for a user. Supports pagination and search.
POST /feeds/favorites: Add a new favorite post
DELETE /feeds/favorites: Deletes/removes a favorited post
GET /feeds/favorites/post: Gets detailed post info for favorited posts only, i.e. you can see any favorited content. Supports pagination and search.
GET /feeds/follow/list: Gets the list of all feeds followed by a user. Supports pagination and search.
GET /feeds/follow/posts/comments: Gets all comments related to a specific post
POST /feeds/follow/posts/comments: Post a comment from a user
PATCH /feeds/created/{}: Feed Manager. Allows a user to edit a feed they created. Allows hiding and unhiding of created feeds.
GET /feeds/created: Feed Manager. Get all feeds created by a user as well as related statistics such as follows and ratings.
GET /follow/posts/comments/{postID}: Get all comments for a particular post
DELETE /follow/posts/comments/{postID}: Remove/clear a comment notification
POST /follow/posts/comments: Add a comment to an existing post.
GET /follow/posts/{postID}: Get the data around and on a specific rss feed post. Works in tandem with the share functionality.
GET /feeds/sample-posts/{feedID}: Get sample random posts for specified posts to demonstrate a "taste" of what they look like.
GET /top/creators: Get the top x users of the API. It uses an algorithm explained in this readme.
GET /search-options/feeds: Get all available feeds to populate your search filter.
GET /search-options/feed-type: Get all available feed-type to populate your search filters.
GET /subscriptions: Get all transactional/subscription data for a specific user
POST /subscriptions/initialize: Initializes a subscription intent, which will return a redirect to the payment gateway
POST /subscriptions/verify: Verifies a transaction made by a specific user via the gateway sent back from the init request
GET /subscriptions/plans: Gets back all subscription plans supported by the app.
POST /subscriptions/plans: Add a subscription plan including details such as features, prices and more.
GET /subscriptions/challenged: A poll endpoint to check whether a user has a challenged subscription transaction.
PATCH /subscriptions/challenged: Update challenged transactions. Doing this will only delay a recurring charge
PATCH /subscriptions: Allows a user to cancel an existing subscription, preventing any further recurring charges.
POST /api/activation: Allows a manual request for a new reset token email for newly registered users
The project has existing tests represented by files ending with "_test", e.g. rssdata_test.go. Each test file contains a number of tests to run on various entities, mainly functions. The test files are organized into structs of tests and their corresponding test logic.
You can run them directly from the VS Code test UI. Below are test results for the scraper:
=== RUN Test_application_rssFeedScraper
=== RUN Test_application_rssFeedScraper/Test1
Fetching: bbc
--- PASS: Test_application_rssFeedScraper/Test1 (0.00s)
=== RUN Test_application_rssFeedScraper/Test2
Fetching: Lane's
--- PASS: Test_application_rssFeedScraper/Test2 (0.00s)
=== RUN Test_application_rssFeedScraper/Test3
Fetching: Megaphone
--- PASS: Test_application_rssFeedScraper/Test3 (0.00s)
=== RUN Test_application_rssFeedScraper/Test4
Fetching: Daily Podcast
--- PASS: Test_application_rssFeedScraper/Test4 (0.00s)
=== RUN Test_application_rssFeedScraper/Test5
Fetching: Endagadget
--- PASS: Test_application_rssFeedScraper/Test5 (0.00s)
--- PASS: Test_application_rssFeedScraper (0.00s)
PASS
ok github.com/blue-davinci/aggregate/cmd/api 0.874s
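The struct-of-tests organization mentioned above generally follows Go's table-driven style. Here is a minimal sketch, assuming a hypothetical fetchFeed function; the real scraper fetches and parses RSS/Atom over HTTP, which this stand-in does not do.

```go
package main

import (
	"fmt"
	"strings"
)

// fetchFeed stands in for the scraper's fetch logic. It only
// validates the URL scheme; the real project performs HTTP
// fetching and RSS/Atom parsing.
func fetchFeed(name, url string) error {
	fmt.Println("Fetching:", name)
	if !strings.HasPrefix(url, "http") {
		return fmt.Errorf("invalid feed url: %s", url)
	}
	return nil
}

func main() {
	// A table of cases, mirroring the struct-of-tests layout
	// used by files like rssdata_test.go.
	tests := []struct {
		name string
		url  string
	}{
		{"bbc", "https://feeds.bbci.co.uk/news/rss.xml"},
		{"Megaphone", "https://feeds.megaphone.fm/example"},
	}
	for _, tt := range tests {
		if err := fetchFeed(tt.name, tt.url); err != nil {
			fmt.Println("FAIL:", tt.name, err)
			continue
		}
		fmt.Println("PASS:", tt.name)
	}
}
```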
As mentioned earlier, the API uses a number of flags which you can use when launching the application.
An example of launching the application with your own SMTP server's settings:
make build/api ## build api using the makefile
./bin/api.exe -smtp-username=pigbearman -smtp-password=algor ## run the built api with your own values
Direct Run:
go run ./cmd/api
We calculate the score for each user a bit differently. Although v1 used a simple division of feed follows by feed creations, we have since moved on, and the algorithm now looks like this:
score = (total feeds) / Σ(follows * w_f * e^(-λ * t_f) + likes * w_l * e^(-λ * t_l) + comments * w_c * e^(-λ * t_c) + R) * 100
Where:
User X: 100 follows, 50 likes, 20 comments, and a 10/30 consistency ratio.
Engagement Score:
Engagement Score = (100 * 0.7) + (50 * 0.3) + (20 * 0.2) = 70 + 15 + 4 = 89
Consistency Score:
Consistency Score = (10 / 30) = 0.33
(normalize to 0-100 range: 0.33 * 100 = 33)
Random Factor:
Random Factor = R (a small random value between 0 and 1)
Final Score:
Final Score = ((89 * 0.8) + (33 * 0.2) + R) = 71.2 + 6.6 + R = 77.8 + R
This application uses Paystack to handle payments. For the setup, you will need to:
Register with Paystack
Get your Paystack API key and add the following to the .env file in the cmd\api dir as below:
PAYSTACK_SECRET_KEY=xxxx-paystack-api-xxxxxxx
That is all you need for the setup. The Paystack API works on the basis of an initialization and a verification, which can be done via a webhook or a poll.
As it works in tandem with the app's limitation parameters, you can change them by using the limitation flags already listed above.
Please Note: The application also supports payments through Mobile Money in addition to supported Cards.
This application can be deployed using Docker and Docker Compose. Here are the steps to do so:
cd aggregate
Build and run the services defined in the docker-compose.yml file:
docker compose up --build
Please remember you can use the flags mentioned here while running the API by setting them in the Dockerfile like so:
CMD ["./bin/api.exe", "-smtp-username", "smtp username", "-port", "your_port", "-smtp-password", "your_smtp_pass"]
Note: There is a pre-built package for anyone who may feel less enthusiastic about building it themselves. You can get it going by doing:
docker pull ghcr.io/blue-davinci/aggregate:latest
See also the list of contributors who participated in this project.
Aggregate was born from the necessity to streamline how we consume and manage the vast ocean of information available online. As avid tech enthusiasts and news followers, we often found ourselves juggling between multiple sources to stay updated.
This sparked the idea to create a unified platform where all our favorite feeds could be aggregated into one seamless experience. Thus, Aggregate was conceived with a mission to simplify and enhance the way we access information.