Compare commits

...

84 Commits

Author SHA1 Message Date
Carlos Quintana 9d2a35b9c2
fix: monitoring table name (#2120) 2024-05-24 11:09:10 +02:00
Carlos Quintana 5f190d4b46
fix: monitoring table name 2024-05-24 10:52:08 +02:00
Carlos Quintana 6862ed3602
fix: event listener (#2119)
* fix: commit transaction after taking event

* feat: allow to reconnect to postgres for event listener

* chore: log sync events pending to process to metrics

* fix: make dead_letter runner able to process events without needing to have lock on the event

* chore: close Session after reconnect

* refactor: make EventSource emit only events that can be processed
2024-05-24 10:21:19 +02:00
Carlos Quintana 450322fff1
feat: allow to disable event-webhook (#2118) 2024-05-23 16:50:54 +02:00
Carlos Quintana aad6f59e96
Improve error handling on event sink (#2117)
* chore: make event_sink return success

* fix: add return to ConsoleEventSink
2024-05-23 15:05:47 +02:00
Carlos Quintana 8eccb05e33
feat: implement HTTP event sink (#2116)
* feat: implement HTTP event sink

* Update events/event_sink.py

---------

Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>
2024-05-23 11:32:45 +02:00
Carlos Quintana 3e0b7bb369
Add sync events (#2113)
* feat: add protocol buffers for events

* chore: add EventDispatcher

* chore: add WebhookEvent class

* chore: emit events

* feat: initial version of event listener

* chore: emit user plan change with new timestamp

* feat: emit metrics + add alias status to create event

* chore: add newrelic decorator to functions

* fix: event emitter fixes

* fix: take null end_time into account

* fix: avoid double-commits

* chore: move UserDeleted event to User.delete method

* db: add index to sync_event created_at and taken_time columns

* chore: add index to model
2024-05-23 10:27:08 +02:00
Son Nguyen Kim 60ab8c15ec
show app page (#2110)
Co-authored-by: Son NK <son@simplelogin.io>
2024-05-22 15:43:36 +02:00
Son Nguyen Kim b5b167479f
Fix admin loop (#2103)
* mailbox page requires sudo

* fix the loop when non-admin user visits an admin URL

https://github.com/simple-login/app/issues/2101

---------

Co-authored-by: Son NK <son@simplelogin.io>
2024-05-10 18:52:12 +02:00
Adrià Casajús 8f12fabd81
Make hibp rate configurable (#2105) 2024-05-10 18:51:16 +02:00
Daniel Mühlbachler-Pietrzykowski b6004f3336
feat: use oidc well-known url (#2077) 2024-05-02 16:17:10 +02:00
Adrià Casajús 80c8bc820b
Do not double count AlilasMailboxes with Aliases (#2095)
* Do not double count aliasmailboxes with aliases

* Keep Sl-Queue-id
2024-04-30 16:41:47 +02:00
Son Nguyen Kim 037bc9da36
mailbox page requires sudo (#2094)
Co-authored-by: Son NK <son@simplelogin.io>
2024-04-23 22:25:37 +02:00
Son Nguyen Kim ee0be3688f
Data breach (#2093)
* add User.enable_data_breach_check column

* user can turn on/off the data breach check

* only run data breach check for user who enables it

* add tips to run tests using a local DB (without docker)

* refactor True check

* trim trailing space

* fix test

* Apply suggestions from code review

Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>

* format

---------

Co-authored-by: Son NK <son@simplelogin.io>
Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>
2024-04-23 22:16:36 +02:00
Adrià Casajús 015036b499
Prevent proton mailboxes from enabling pgp encryption (#2086) 2024-04-12 15:19:41 +02:00
Son Nguyen Kim d5df91aab6
Premium user can enable data breach monitoring (#2084)
* add User.enable_data_breach_check column

* user can turn on/off the data breach check

* only run data breach check for user who enables it

* add tips to run tests using a local DB (without docker)

* refactor True check

* trim trailing space

* fix test

* Apply suggestions from code review

Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>

* format

---------

Co-authored-by: Son NK <son@simplelogin.io>
Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>
2024-04-12 10:39:23 +02:00
Adrià Casajús 2eb5feaa8f
Small improvements (#2082)
* Update logs with more relevant info for debugging purposes

* Improved logs for alias creation rate-limit

* Reduce sudo time to 120 secs

* log fixes

* Fix missing object to add to the session
2024-04-08 15:05:51 +02:00
Adrià Casajús 3c364da37d
Dmarc fix (#2079)
* Add log to spam check + remove invisible characters on import

* Update log
2024-03-26 11:43:33 +01:00
Adrià Casajús 36cf530ef8
Preserve X-SL-Queue-Id (#2076) 2024-03-22 11:00:06 +01:00
Adrià Casajús 0da1811311
Cleanup old data (#2066)
* Cleanup tasks

* Update

* Added tests

* Create cron job

* Delete old data cron

* Fix import

* import fix

* Added delete + script to disable pgp for proton mboxes
2024-03-18 16:00:21 +01:00
Adrià Casajús f2fcaa6c60
Cleanup also messsage-id headers from linebreaks (#2067) 2024-03-18 14:27:38 +01:00
Adrià Casajús aa2c676b5e
Only check HIBP alias of paid users (#2065) 2024-03-15 10:13:06 +01:00
Adrià Casajús 30ddd4c807
Update oneshot commands (#2064)
* Update oneshot commands

* fix

* Fix test_load command

* Rename to avoid test executing it

* Do HIBP breach check in batches instead of a single load
2024-03-14 16:03:43 +01:00
Son Nguyen Kim f5babd9c81
Move import export back to setting (#2063)
* replace black by ruff

* move alias import/export to settings

* fix html closing tag

* add rate limit for alias import & export

---------

Co-authored-by: Son NK <son@simplelogin.io>
2024-03-14 15:56:35 +01:00
Adrià Casajús 74b811dd35
Update oneshot commands (#2060)
* Update oneshot commands

* fix

* Fix test_load command

* Rename to avoid test executing it
2024-03-14 11:11:50 +01:00
martadams89 e6c51bcf20
ci: remove v7 (#2062) 2024-03-14 09:49:45 +01:00
martadams89 4bfc6b9aca
ci: add arm docker images (#2056)
* ci: add arm docker images

* ci: remove armv6

* ci: update workflow to run only on path

* Update .github/workflows/main.yml

---------

Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>
2024-03-13 15:20:47 +01:00
Adrià Casajús e96de79665
Add missing indexes and mark aliases as created by partner (#2058)
* Add missing indexes and mark aliases as created by partner

* Configure if we should skip the partner aliases or not
2024-03-13 14:30:17 +01:00
Daniel Mühlbachler-Pietrzykowski a608503df6
feat: add generic OIDC connect (#2046) 2024-03-13 14:30:00 +01:00
Son Nguyen Kim 0c3c6db2ab
point to the new safari extension (#2059)
Co-authored-by: Son NK <son@simplelogin.io>
2024-03-12 23:05:54 +01:00
Adrià Casajús 9719a36dab
Do not replace unsubs that go to UNSUBSCRIBER (#2051) 2024-03-06 16:26:10 +01:00
Adrià Casajús a7d4bd15a7
If the transactional_id is None do nothing 2024-03-04 17:47:06 +01:00
Adrià Casajús 565f6dc142
If there is no transactional id, skip it (#2047) 2024-03-04 17:44:19 +01:00
Adrià Casajús 76423527dd
Update HIBP async script (#2043)
* Update HIBP async script

* Fix: continue instead of return
2024-03-04 13:12:38 +01:00
Adrià Casajús 501b225e40
Require sudo for account changes (#2041)
* Move accounts settings under sudo

* Fixed sudo mode

* Add a log message

* Update test

* Renamed sudo_setting to account_setting

* Moved simple login data export and alias/import export to account settings

* Move account settings to the top-right dropdown
2024-02-29 11:20:29 +01:00
Adrià Casajús 1dada1a4b5
Allow to skip creating transactional emails (#2042) 2024-02-27 16:52:45 +01:00
Adrià Casajús 37f227da42
Fix format 2024-02-27 09:41:47 +01:00
Adrià Casajús 97e68159c5
Fix: Never use NOREPLY to create contacts (#2039) 2024-02-27 09:29:53 +01:00
Adrià Casajús 673e19b287
Sanitize unused next parameter (#2040) 2024-02-26 19:23:03 +01:00
Sukuna 5959d40a00
Added comments to test_login.py (#2035)
* Added comments to test_login.py

-Added comments to each test function to provide clear documentation of the test steps.
-Comments detail the purpose of each test, the actions taken, and the expected outcomes.
-Improved readability and maintainability of the test suite.
-No changes in functionality; only added comments for better code understanding.

* Removed comments from import file in test_login.py
2024-02-26 17:41:54 +01:00
Adrià Casajús 173ae6a221
Allow to soft-delete users (#2034)
* Allow the possibility of soft-deleting users

* Unschedule for delete after link

* Add dry run to the cron
2024-02-22 17:38:34 +01:00
Adrià Casajús eb92823ef8
Removed potentially nsfw words 2024-02-20 11:17:28 +01:00
Adrià Casajús 363b851f61
Fix: use proper bucket time for the rate limit 2024-02-20 11:13:06 +01:00
Adrià Casajús d0a6b8ed79
Add start and end flags to parallelize call (#2033) 2024-02-19 16:46:35 +01:00
Adrià Casajús 50c130a3a3
Store the latest email_log id in the alias to simplify dashboard query (#2022)
* Store the latest email_log id in the alias to simplify dashboard query

* Fix test

* Add script to migrate users last email_log_id to alias

* Always update the alias last_email_log_id automatically

* Only set the alias_id if it is set

* Fix test with randomization

* Fix notification test

* Also remove explicit set on tests

* Rate limit alias creation to prevent abuse (#2021)

* Rate limit alias creation to prevent abuse

* Limit in secs

* Calculate bucket time

* fix exception

* Tune limits

* Move rate limit config to configuration (#2023)

* Fix dropdown item in header (#2024)

* Add option for admin to stop trial (#2026)

* Fix: if redis is not configured do not enable rate limit (#2027)

* support product IDs for the new Mac app (#2028)

Co-authored-by: Son NK <son@simplelogin.io>

* Add metrics to rate limit (#2029)

* Order domains alphabetically when retrieving them (#2030)

* Removed unused import

* Remove debug info

---------

Co-authored-by: D-Bao <49440133+D-Bao@users.noreply.github.com>
Co-authored-by: Son Nguyen Kim <son.nguyen@proton.ch>
Co-authored-by: Son NK <son@simplelogin.io>
2024-02-15 15:48:02 +01:00
Adrià Casajús b462c256d3
Order domains alphabetically when retrieving them (#2030) 2024-02-08 15:36:06 +01:00
Adrià Casajús f756b04ead
Add metrics to rate limit (#2029) 2024-02-06 11:55:45 +01:00
Son Nguyen Kim 05d18c23cc
support product IDs for the new Mac app (#2028)
Co-authored-by: Son NK <son@simplelogin.io>
2024-02-06 11:54:02 +01:00
Adrià Casajús 4a7c0293f8
Fix: if redis is not configured do not enable rate limit (#2027) 2024-02-05 14:53:01 +01:00
Adrià Casajús 30aaf118e7
Add option for admin to stop trial (#2026) 2024-02-05 13:47:39 +01:00
D-Bao 7b0d6dae1b
Fix dropdown item in header (#2024) 2024-02-02 10:23:05 +01:00
Adrià Casajús b6f1cecee9
Move rate limit config to configuration (#2023) 2024-02-01 14:47:15 +01:00
Adrià Casajús d12e776949
Rate limit alias creation to prevent abuse (#2021)
* Rate limit alias creation to prevent abuse

* Limit in secs

* Calculate bucket time

* fix exception

* Tune limits
2024-01-30 18:29:59 +01:00
Ed b8dad2d657
Update README.md (#2011)
Updated README.md to prevent Nginx redirecting the browser to the local address of the machine
2024-01-26 10:30:16 +01:00
Son Nguyen Kim 860ce03f2a
fix footer spacing again (#2018)
Co-authored-by: Son NK <son@simplelogin.io>
2024-01-26 10:27:57 +01:00
Son Nguyen Kim 71bb7bc795
fix space issue on footer (#2017)
Co-authored-by: Son NK <son@simplelogin.io>
2024-01-23 14:57:58 +01:00
Adrià Casajús 761420ece9
Prevent mailboxes that have been disabled from being used again (#2016)
* Prevent mailboxes that have been disabled from being used again

* Improve test

* Get one user since it will be unique
2024-01-23 14:57:40 +01:00
Adrià Casajús c3848862c3
Fix: limit the id sizes we generate and remove spaces after unidecode 2024-01-22 17:42:58 +01:00
Adrià Casajús da09db3864
Do not allow free users to create reverse alias to reduce abuse (#2013)
* Do not allow free users to create reverse alias to reduce abuse

* Update format

* Move function under user

* Update tests
2024-01-16 14:51:01 +01:00
Adrià Casajús 44138e25a5
Fix: Dedup the list of mailboxes for an alias (#2010) 2024-01-16 14:50:39 +01:00
Adrià Casajús b541ca4ceb
Fix typo in email (#2008) 2024-01-10 10:30:16 +01:00
Revi99 66c18e2f8e
small fixes (#2001)
* add forum mention

* add forum mention

* Add forum mention

* add forum mention

* fix my mistake

* fix
2024-01-08 21:40:52 +01:00
Son Nguyen Kim 4a046c5f6f
fix error when user logs out, go back to /dashboard and has the server error (#2003)
* fix error when user logs out, go back to /dashboard and has the server error

* reformat files. Not run ruff on migrations/ and .venv

---------

Co-authored-by: Son NK <son@simplelogin.io>
2024-01-05 14:30:07 +01:00
Ueri8 a731bf4435
Small fixes due to SL moving to Switzerland (#1999)
* Fix footer due to recent changes

* Fix forum mention
2024-01-04 12:07:59 +01:00
SecurityGuy f3127dc857
Generate working DKIM keys by adding -traditional flag and update NGINX instructions to avoid breaking certbot (#1989)
* Update README.md

Add -traditional option to openssl genrsa to avoid Python DKIM library (dkimpy) error that prevents email from being sent:

dkim.asn1.ASN1FormatError: Unexpected tag (got 30, expecting 02)

Ref: https://bugs.launchpad.net/dkimpy/+bug/1708917

* Update NGINX instructions

Include warning to delete /etc/nginx/sites-enabled/default to avoid a conflict that breaks certbot.
2024-01-03 14:08:39 +01:00
Kelp8 d9d28d3c75
I don’t think we really need that (#1992) 2024-01-03 14:08:05 +01:00
Kelp8 bca6bfa617
Remove sensitive words (#1994) 2024-01-03 12:38:13 +01:00
Agent-XD 5d6a4963a0
Small fixes (#1991)
* remove proprietary mention

* Add forum mention

* Sync activation.txt with activation.html

* Add subdomain information

* Make info look better

* Fix wording
2024-01-03 12:35:42 +01:00
Kelp8 00737f68de
Minor wordings change (#1985)
* Wording changes

* Add information to avoid being put in SPAM

* Remove word repeating

* Add forum mention

* Add forum mention to header.html

* Add info to avoid person marking as SPAM
2024-01-02 13:20:48 +01:00
Joseph Demcher 9ae206ec77
correct alias version in api.md (#1981) 2024-01-02 12:35:56 +01:00
Agent-XD 9452b14e10
Remove sensitive words (#1983)
* Remove sensitive words from test_words.txt

* Remove sensitive from words.txt

* remove sensitive from words_alpha.txt

* Update test_words.txt
2024-01-02 12:33:27 +01:00
Son Nguyen Kim 7705fa1c9b
reduce rate limit on /v2/aliases endpoint (#1979)
Co-authored-by: Son NK <son@simplelogin.io>
2023-12-27 16:42:58 +01:00
Adrià Casajús 1dfb0e3356
Require CSRF check on custom alias creation (#1977) 2023-12-20 16:15:01 +01:00
Adrià Casajús 2a9c1c5658
Increase limit for the dashboard and do it by user 2023-12-19 17:27:55 +01:00
Carlos Quintana dc39ab2de7
chore: remove verbose log (#1971) 2023-12-15 10:39:02 +01:00
Adrià Casajús fe1c66268b
Allow to use another S3 provider (#1970) 2023-12-14 15:55:37 +01:00
Adrià Casajús 72041ee520
Show BF banner until end of promotion (#1953) 2023-11-30 11:48:55 +01:00
Adrià Casajús f81f8ca032
Further limit the index endpoint (#1950) 2023-11-21 17:44:33 +01:00
Adrià Casajús 31896ff262
Replace black and flake8 with ruff (#1943) 2023-11-21 16:42:18 +01:00
Adrià Casajús 45575261dc
Rate limit index endpoint (#1948) 2023-11-21 14:42:24 +01:00
Adrià Casajús 627ad302d2
Creating account via partner also canonicalizes email (#1939) 2023-11-08 09:58:01 +01:00
Son NK 08862a35c3 fix image size 2023-11-07 14:33:46 +01:00
Son Nguyen Kim 75dd3cf925
admin can clone newsletter (#1938)
* admin can clone newsletter

- remove unique constraint on newsletter subject
- admin can clone newsletter

* update coupon image

---------

Co-authored-by: Son NK <son@simplelogin.io>
2023-11-07 14:16:03 +01:00
Son Nguyen Kim a097e33abe
black friday 2023 (#1937)
Co-authored-by: Son NK <son@simplelogin.io>
2023-11-07 13:53:28 +01:00
165 changed files with 3882 additions and 371057 deletions

View File

@ -1,7 +1,6 @@
name: Test and lint
on:
push:
on: [push, pull_request]
jobs:
lint:
@ -139,6 +138,12 @@ jobs:
with:
fetch-depth: 0
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Create Sentry release
uses: getsentry/action-release@v1
env:
@ -158,6 +163,7 @@ jobs:
uses: docker/build-push-action@v3
with:
context: .
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ steps.meta.outputs.tags }}

View File

@ -7,18 +7,19 @@ repos:
hooks:
- id: check-yaml
- id: trailing-whitespace
- repo: https://github.com/psf/black
rev: 22.3.0
hooks:
- id: black
- repo: https://github.com/pycqa/flake8
rev: 3.9.2
hooks:
- id: flake8
- repo: https://github.com/Riverside-Healthcare/djLint
rev: v1.3.0
hooks:
- id: djlint-jinja
files: '.*\.html'
entry: djlint --reformat
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.1.5
hooks:
# Run the linter.
- id: ruff
args: [ --fix ]
# Run the formatter.
- id: ruff-format

View File

@ -68,6 +68,12 @@ For most tests, you will need to have ``redis`` installed and started on your ma
sh scripts/run-test.sh
```
You can also run tests using a local Postgres DB to speed things up. This can be done by
- creating an empty test DB and running the database migration by `dropdb test && createdb test && DB_URI=postgresql://localhost:5432/test alembic upgrade head`
- replacing the `DB_URI` in `test.env` file by `DB_URI=postgresql://localhost:5432/test`
## Run the code locally
Install npm packages
@ -151,10 +157,10 @@ Here are the small sum-ups of the directory structures and their roles:
## Pull request
The code is formatted using https://github.com/psf/black, to format the code, simply run
The code is formatted using [ruff](https://github.com/astral-sh/ruff), to format the code, simply run
```
poetry run black .
poetry run ruff format .
```
The code is also checked with `flake8`, make sure to run `flake8` before creating the pull request by

View File

@ -74,7 +74,7 @@ Setting up DKIM is highly recommended to reduce the chance your emails ending up
First you need to generate a private and public key for DKIM:
```bash
openssl genrsa -out dkim.key 1024
openssl genrsa -out dkim.key -traditional 1024
openssl rsa -in dkim.key -pubout -out dkim.pub.key
```
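Why `-traditional` matters: OpenSSL 3.x writes PKCS#8 PEM by default, while the dkimpy library that signs outgoing mail parses the older PKCS#1 layout, hence the ASN.1 error quoted in the commit message above. A quick hedged sanity check of the generated key:

```python
# The first PEM line distinguishes the two encodings:
#   PKCS#1 (what dkimpy expects):  -----BEGIN RSA PRIVATE KEY-----
#   PKCS#8 (OpenSSL 3.x default):  -----BEGIN PRIVATE KEY-----
with open("dkim.key") as f:
    header = f.readline().strip()
assert header == "-----BEGIN RSA PRIVATE KEY-----", f"unexpected header: {header}"
```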
@ -510,11 +510,14 @@ server {
server_name app.mydomain.com;
location / {
proxy_pass http://localhost:7777;
proxy_pass http://localhost:7777;
proxy_set_header Host $host;
}
}
```
Note: If `/etc/nginx/sites-enabled/default` exists, delete it or certbot will fail due to the conflict. The `simplelogin` file should be the only file in `sites-enabled`.
Reload Nginx with the command below
```bash

View File

@ -5,10 +5,11 @@ from typing import Optional
from arrow import Arrow
from newrelic import agent
from sqlalchemy import or_
from app.db import Session
from app.email_utils import send_welcome_email
from app.utils import sanitize_email
from app.utils import sanitize_email, canonicalize_email
from app.errors import (
AccountAlreadyLinkedToAnotherPartnerException,
AccountIsUsingAliasAsEmail,
@ -131,8 +132,9 @@ class ClientMergeStrategy(ABC):
class NewUserStrategy(ClientMergeStrategy):
def process(self) -> LinkResult:
# Will create a new SL User with a random password
canonical_email = canonicalize_email(self.link_request.email)
new_user = User.create(
email=self.link_request.email,
email=canonical_email,
name=self.link_request.name,
password=random_string(20),
activated=True,
@ -166,7 +168,8 @@ class NewUserStrategy(ClientMergeStrategy):
class ExistingUnlinkedUserStrategy(ClientMergeStrategy):
def process(self) -> LinkResult:
# IF it was scheduled to be deleted. Unschedule it.
self.user.delete_on = None
partner_user = ensure_partner_user_exists_for_user(
self.link_request, self.user, self.partner
)
@ -213,11 +216,21 @@ def process_login_case(
partner_id=partner.id, external_user_id=link_request.external_user_id
)
if partner_user is None:
canonical_email = canonicalize_email(link_request.email)
# We didn't find any SimpleLogin user registered with that partner user id
# Make sure they aren't using an alias as their link email
check_alias(link_request.email)
check_alias(canonical_email)
# Try to find it using the partner's e-mail address
user = User.get_by(email=link_request.email)
users = User.filter(
or_(User.email == link_request.email, User.email == canonical_email)
).all()
if len(users) > 1:
user = [user for user in users if user.email == canonical_email][0]
elif len(users) == 1:
user = users[0]
else:
user = None
return get_login_strategy(link_request, user, partner).process()
else:
# We found the SL user registered with that partner user id
@ -235,6 +248,8 @@ def link_user(
) -> LinkResult:
# Sanitize email just in case
link_request.email = sanitize_email(link_request.email)
# If it was scheduled to be deleted. Unschedule it.
current_user.delete_on = None
partner_user = ensure_partner_user_exists_for_user(
link_request, current_user, partner
)
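To illustrate the two-candidate lookup above: `canonicalize_email` is assumed to collapse provider-specific local-part variants (dots and `+tag` suffixes on providers such as Gmail), so the partner's address and an existing SimpleLogin account can refer to the same inbox under different spellings. A sketch of the selection rule, with hypothetical values:

```python
# Hypothetical inputs, for illustration only:
#   link_request.email = "john.doe+sl@gmail.com"
#   canonical_email    = "johndoe@gmail.com"  (assumed canonicalize_email output)

def pick_user(users: list, canonical_email: str):
    # Mirror of the logic above: prefer the user already stored under
    # the canonical form; otherwise take the single match, if any.
    if len(users) > 1:
        return next(u for u in users if u.email == canonical_email)
    return users[0] if users else None
```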

View File

@ -46,7 +46,8 @@ class SLModelView(sqla.ModelView):
def inaccessible_callback(self, name, **kwargs):
# redirect to login page if user doesn't have access
return redirect(url_for("auth.login", next=request.url))
flash("You don't have access to the admin page", "error")
return redirect(url_for("dashboard.index", next=request.url))
def on_model_change(self, form, model, is_created):
changes = {}
@ -214,6 +215,20 @@ class UserAdmin(SLModelView):
Session.commit()
@action(
"remove trial",
"Stop trial period",
"Remove trial for this user?",
)
def stop_trial(self, ids):
for user in User.filter(User.id.in_(ids)):
user.trial_end = None
flash(f"Stopped trial for {user}", "success")
AdminAuditLog.stop_trial(current_user.id, user.id)
Session.commit()
@action(
"disable_otp_fido",
"Disable OTP & FIDO",
@ -611,6 +626,26 @@ class NewsletterAdmin(SLModelView):
else:
flash(error_msg, "error")
@action(
"clone_newsletter",
"Clone this newsletter",
)
def clone_newsletter(self, newsletter_ids):
if len(newsletter_ids) != 1:
flash("you can only select 1 newsletter", "error")
return
newsletter_id = newsletter_ids[0]
newsletter: Newsletter = Newsletter.get(newsletter_id)
new_newsletter = Newsletter.create(
subject=newsletter.subject,
html=newsletter.html,
plain_text=newsletter.plain_text,
commit=True,
)
flash(f"Newsletter {new_newsletter.subject} has been cloned", "success")
class NewsletterUserAdmin(SLModelView):
column_searchable_list = ["id"]

View File

@ -70,7 +70,6 @@ def verify_prefix_suffix(
# when DISABLE_ALIAS_SUFFIX is true, alias_domain_prefix is empty
and not config.DISABLE_ALIAS_SUFFIX
):
if not alias_domain_prefix.startswith("."):
LOG.e("User %s submits a wrong alias suffix %s", user, alias_suffix)
return False

View File

@ -25,6 +25,8 @@ from app.email_utils import (
render,
)
from app.errors import AliasInTrashError
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import AliasDeleted, AliasStatusChange, EventContent
from app.log import LOG
from app.models import (
Alias,
@ -308,31 +310,36 @@ def delete_alias(alias: Alias, user: User):
Delete an alias and add it to either global or domain trash
Should be used instead of Alias.delete, DomainDeletedAlias.create, DeletedAlias.create
"""
# save deleted alias to either global or domain trash
LOG.i(f"User {user} has deleted alias {alias}")
# save deleted alias to either global or domain trash
if alias.custom_domain_id:
if not DomainDeletedAlias.get_by(
email=alias.email, domain_id=alias.custom_domain_id
):
LOG.d("add %s to domain %s trash", alias, alias.custom_domain_id)
Session.add(
DomainDeletedAlias(
user_id=user.id,
email=alias.email,
domain_id=alias.custom_domain_id,
)
domain_deleted_alias = DomainDeletedAlias(
user_id=user.id,
email=alias.email,
domain_id=alias.custom_domain_id,
)
Session.add(domain_deleted_alias)
Session.commit()
LOG.i(
f"Moving {alias} to domain {alias.custom_domain_id} trash {domain_deleted_alias}"
)
else:
if not DeletedAlias.get_by(email=alias.email):
LOG.d("add %s to global trash", alias)
Session.add(DeletedAlias(email=alias.email))
deleted_alias = DeletedAlias(email=alias.email)
Session.add(deleted_alias)
Session.commit()
LOG.i(f"Moving {alias} to global trash {deleted_alias}")
LOG.i("delete alias %s", alias)
Alias.filter(Alias.id == alias.id).delete()
Session.commit()
EventDispatcher.send_event(
user, EventContent(alias_deleted=AliasDeleted(alias_id=alias.id))
)
def aliases_for_mailbox(mailbox: Mailbox) -> [Alias]:
"""
@ -458,3 +465,15 @@ def transfer_alias(alias, new_user, new_mailboxes: [Mailbox]):
alias.pinned = False
Session.commit()
def change_alias_status(alias: Alias, enabled: bool, commit: bool = False):
alias.enabled = enabled
event = AliasStatusChange(
alias_id=alias.id, alias_email=alias.email, enabled=enabled
)
EventDispatcher.send_event(alias.user, EventContent(alias_status_change=event))
if commit:
Session.commit()
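With `change_alias_status` in place, call sites stop flipping `Alias.enabled` directly (the API diff for `toggle_alias` further down shows the real migration). A minimal sketch of the new call pattern:

```python
from app import alias_utils
from app.models import Alias

def disable(alias: Alias) -> None:
    # Before: alias.enabled = False; Session.commit()
    # After: the helper also dispatches an AliasStatusChange event and,
    # with commit=True, persists the flag and the event together.
    alias_utils.change_alias_status(alias, enabled=False, commit=True)
```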

View File

@ -16,3 +16,22 @@ from .views import (
sudo,
user,
)
__all__ = [
"alias_options",
"new_custom_alias",
"custom_domain",
"new_random_alias",
"user_info",
"auth",
"auth_mfa",
"alias",
"apple",
"mailbox",
"notification",
"setting",
"export",
"phone",
"sudo",
"user",
]

View File

@ -33,6 +33,9 @@ def authorize_request() -> Optional[Tuple[str, int]]:
if g.user.disabled:
return jsonify(error="Disabled account"), 403
if not g.user.is_active():
return jsonify(error="Account does not exist"), 401
g.api_key = api_key
return None

View File

@ -201,10 +201,10 @@ def get_alias_infos_with_pagination_v3(
q = q.order_by(Alias.pinned.desc())
q = q.order_by(latest_activity.desc())
q = list(q.limit(page_limit).offset(page_id * page_size))
q = q.limit(page_limit).offset(page_id * page_size)
ret = []
for alias, contact, email_log, nb_reply, nb_blocked, nb_forward in q:
for alias, contact, email_log, nb_reply, nb_blocked, nb_forward in list(q):
ret.append(
AliasInfo(
alias=alias,
@ -358,7 +358,6 @@ def construct_alias_query(user: User):
else_=0,
)
).label("nb_forward"),
func.max(EmailLog.created_at).label("latest_email_log_created_at"),
)
.join(EmailLog, Alias.id == EmailLog.alias_id, isouter=True)
.filter(Alias.user_id == user.id)
@ -366,14 +365,6 @@ def construct_alias_query(user: User):
.subquery()
)
alias_contact_subquery = (
Session.query(Alias.id, func.max(Contact.id).label("max_contact_id"))
.join(Contact, Alias.id == Contact.alias_id, isouter=True)
.filter(Alias.user_id == user.id)
.group_by(Alias.id)
.subquery()
)
return (
Session.query(
Alias,
@ -385,23 +376,7 @@ def construct_alias_query(user: User):
)
.options(joinedload(Alias.hibp_breaches))
.options(joinedload(Alias.custom_domain))
.join(Contact, Alias.id == Contact.alias_id, isouter=True)
.join(EmailLog, Contact.id == EmailLog.contact_id, isouter=True)
.join(EmailLog, Alias.last_email_log_id == EmailLog.id, isouter=True)
.join(Contact, EmailLog.contact_id == Contact.id, isouter=True)
.filter(Alias.id == alias_activity_subquery.c.id)
.filter(Alias.id == alias_contact_subquery.c.id)
.filter(
or_(
EmailLog.created_at
== alias_activity_subquery.c.latest_email_log_created_at,
and_(
# no email log yet for this alias
alias_activity_subquery.c.latest_email_log_created_at.is_(None),
# to make sure only 1 contact is returned in this case
or_(
Contact.id == alias_contact_subquery.c.max_contact_id,
alias_contact_subquery.c.max_contact_id.is_(None),
),
),
)
)
)
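Net effect of this hunk, as a hedged conceptual sketch: the per-alias `max(created_at)` and `max(contact_id)` group-by subqueries are dropped, and the latest activity now comes from the `Alias.last_email_log_id` column cached by commit 50c130a3a3, leaving a pair of plain outer joins:

```python
from app.db import Session
from app.models import Alias, Contact, EmailLog

def latest_activity_query(user_id: int):
    # The alias row remembers its most recent email log, so one outer
    # join per table replaces the old group-by subqueries.
    return (
        Session.query(Alias, Contact, EmailLog)
        .join(EmailLog, Alias.last_email_log_id == EmailLog.id, isouter=True)
        .join(Contact, EmailLog.contact_id == Contact.id, isouter=True)
        .filter(Alias.user_id == user_id)
    )
```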

View File

@ -24,12 +24,14 @@ from app.errors import (
ErrContactAlreadyExists,
ErrAddressInvalid,
)
from app.extensions import limiter
from app.models import Alias, Contact, Mailbox, AliasMailbox
@deprecated
@api_bp.route("/aliases", methods=["GET", "POST"])
@require_api_auth
@limiter.limit("10/minute", key_func=lambda: g.user.id)
def get_aliases():
"""
Get aliases
@ -72,6 +74,7 @@ def get_aliases():
@api_bp.route("/v2/aliases", methods=["GET", "POST"])
@require_api_auth
@limiter.limit("50/minute", key_func=lambda: g.user.id)
def get_aliases_v2():
"""
Get aliases
@ -181,7 +184,7 @@ def toggle_alias(alias_id):
if not alias or alias.user_id != user.id:
return jsonify(error="Forbidden"), 403
alias.enabled = not alias.enabled
alias_utils.change_alias_status(alias, enabled=not alias.enabled)
Session.commit()
return jsonify(enabled=alias.enabled), 200

View File

@ -17,9 +17,14 @@ from app.models import PlanEnum, AppleSubscription
_MONTHLY_PRODUCT_ID = "io.simplelogin.ios_app.subscription.premium.monthly"
_YEARLY_PRODUCT_ID = "io.simplelogin.ios_app.subscription.premium.yearly"
# SL Mac app used to be in SL account
_MACAPP_MONTHLY_PRODUCT_ID = "io.simplelogin.macapp.subscription.premium.monthly"
_MACAPP_YEARLY_PRODUCT_ID = "io.simplelogin.macapp.subscription.premium.yearly"
# SL Mac app is moved to Proton account
_MACAPP_MONTHLY_PRODUCT_ID_NEW = "me.proton.simplelogin.macos.premium.monthly"
_MACAPP_YEARLY_PRODUCT_ID_NEW = "me.proton.simplelogin.macos.premium.yearly"
# Apple API URL
_SANDBOX_URL = "https://sandbox.itunes.apple.com/verifyReceipt"
_PROD_URL = "https://buy.itunes.apple.com/verifyReceipt"
@ -263,7 +268,11 @@ def apple_update_notification():
plan = (
PlanEnum.monthly
if transaction["product_id"]
in (_MONTHLY_PRODUCT_ID, _MACAPP_MONTHLY_PRODUCT_ID)
in (
_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID_NEW,
)
else PlanEnum.yearly
)
@ -517,7 +526,11 @@ def verify_receipt(receipt_data, user, password) -> Optional[AppleSubscription]:
plan = (
PlanEnum.monthly
if latest_transaction["product_id"]
in (_MONTHLY_PRODUCT_ID, _MACAPP_MONTHLY_PRODUCT_ID)
in (
_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID_NEW,
)
else PlanEnum.yearly
)
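Both hunks apply the same rule. A compact sketch of the resulting plan resolution, using the product-id literals defined at the top of the file:

```python
from app.models import PlanEnum

_MONTHLY_IDS = {
    "io.simplelogin.ios_app.subscription.premium.monthly",
    "io.simplelogin.macapp.subscription.premium.monthly",  # legacy Mac app
    "me.proton.simplelogin.macos.premium.monthly",         # new Mac app
}

def plan_for_product(product_id: str) -> PlanEnum:
    # Any recognized monthly id maps to monthly; everything else,
    # including the yearly ids, is treated as yearly.
    return PlanEnum.monthly if product_id in _MONTHLY_IDS else PlanEnum.yearly
```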

View File

@ -11,7 +11,7 @@ from itsdangerous import Signer
from app import email_utils
from app.api.base import api_bp
from app.config import FLASK_SECRET, DISABLE_REGISTRATION
from app.dashboard.views.setting import send_reset_password_email
from app.dashboard.views.account_setting import send_reset_password_email
from app.db import Session
from app.email_utils import (
email_can_be_used_as_mailbox,

View File

@ -45,7 +45,7 @@ def create_mailbox():
mailbox_email = sanitize_email(request.get_json().get("email"))
if not user.is_premium():
return jsonify(error=f"Only premium plan can add additional mailbox"), 400
return jsonify(error="Only premium plan can add additional mailbox"), 400
if not is_valid_email(mailbox_email):
return jsonify(error=f"{mailbox_email} invalid"), 400

View File

@ -150,7 +150,7 @@ def new_custom_alias_v3():
if not data:
return jsonify(error="request body cannot be empty"), 400
if type(data) is not dict:
if not isinstance(data, dict):
return jsonify(error="request body does not follow the required format"), 400
alias_prefix = data.get("alias_prefix", "").strip().lower().replace(" ", "")
@ -168,7 +168,7 @@ def new_custom_alias_v3():
return jsonify(error="alias prefix invalid format or too long"), 400
# check if mailbox is not tempered with
if type(mailbox_ids) is not list:
if not isinstance(mailbox_ids, list):
return jsonify(error="mailbox_ids must be an array of id"), 400
mailboxes = []
for mailbox_id in mailbox_ids:

View File

@ -32,6 +32,7 @@ def user_to_dict(user: User) -> dict:
"in_trial": user.in_trial(),
"max_alias_free_plan": user.max_alias_for_free_account(),
"connected_proton_address": None,
"can_create_reverse_alias": user.can_create_contacts(),
}
if config.CONNECT_WITH_PROTON:
@ -58,6 +59,7 @@ def user_info():
- in_trial
- max_alias_free
- is_connected_with_proton
- can_create_reverse_alias
"""
user = g.user

View File

@ -16,4 +16,26 @@ from .views import (
social,
recovery,
api_to_cookie,
oidc,
)
__all__ = [
"login",
"logout",
"register",
"activate",
"resend_activation",
"reset_password",
"forgot_password",
"github",
"google",
"facebook",
"proton",
"change_email",
"mfa",
"fido",
"social",
"recovery",
"api_to_cookie",
"oidc",
]

View File

@ -3,10 +3,13 @@ from flask_login import login_user
from app.auth.base import auth_bp
from app.db import Session
from app.extensions import limiter
from app.log import LOG
from app.models import EmailChange, ResetPasswordCode
@auth_bp.route("/change_email", methods=["GET", "POST"])
@limiter.limit("3/hour")
def change_email():
code = request.args.get("code")
@ -22,12 +25,14 @@ def change_email():
return render_template("auth/change_email.html")
user = email_change.user
old_email = user.email
user.email = email_change.new_email
EmailChange.delete(email_change.id)
ResetPasswordCode.filter_by(user_id=user.id).delete()
Session.commit()
LOG.i(f"User {user} has changed their email from {old_email} to {user.email}")
flash("Your new email has been updated", "success")
login_user(user)

View File

@ -62,7 +62,7 @@ def fido():
browser = MfaBrowser.get_by(token=request.cookies.get("mfa"))
if browser and not browser.is_expired() and browser.user_id == user.id:
login_user(user)
flash(f"Welcome back!", "success")
flash("Welcome back!", "success")
# Redirect user to correct page
return redirect(next_url or url_for("dashboard.index"))
else:
@ -110,7 +110,7 @@ def fido():
session["sudo_time"] = int(time())
login_user(user)
flash(f"Welcome back!", "success")
flash("Welcome back!", "success")
# Redirect user to correct page
response = make_response(redirect(next_url or url_for("dashboard.index")))

View File

@ -3,7 +3,7 @@ from flask_wtf import FlaskForm
from wtforms import StringField, validators
from app.auth.base import auth_bp
from app.dashboard.views.setting import send_reset_password_email
from app.dashboard.views.account_setting import send_reset_password_email
from app.extensions import limiter
from app.log import LOG
from app.models import User

View File

@ -7,7 +7,7 @@ from app.config import URL, GOOGLE_CLIENT_ID, GOOGLE_CLIENT_SECRET
from app.db import Session
from app.log import LOG
from app.models import User, File, SocialAuth
from app.utils import random_string, sanitize_email
from app.utils import random_string, sanitize_email, sanitize_next_url
from .login_utils import after_login
_authorization_base_url = "https://accounts.google.com/o/oauth2/v2/auth"
@ -29,7 +29,7 @@ def google_login():
# to avoid flask-login displaying the login error message
session.pop("_flashes", None)
next_url = request.args.get("next")
next_url = sanitize_next_url(request.args.get("next"))
# Google does not allow to append param to redirect_url
# we need to pass the next url by session

View File

@ -5,7 +5,7 @@ from wtforms import StringField, validators
from app.auth.base import auth_bp
from app.auth.views.login_utils import after_login
from app.config import CONNECT_WITH_PROTON
from app.config import CONNECT_WITH_PROTON, CONNECT_WITH_OIDC_ICON, OIDC_CLIENT_ID
from app.events.auth_event import LoginEvent
from app.extensions import limiter
from app.log import LOG
@ -77,4 +77,6 @@ def login():
next_url=next_url,
show_resend_activation=show_resend_activation,
connect_with_proton=CONNECT_WITH_PROTON,
connect_with_oidc=OIDC_CLIENT_ID is not None,
connect_with_oidc_icon=CONNECT_WITH_OIDC_ICON,
)

View File

@ -55,7 +55,7 @@ def mfa():
browser = MfaBrowser.get_by(token=request.cookies.get("mfa"))
if browser and not browser.is_expired() and browser.user_id == user.id:
login_user(user)
flash(f"Welcome back!", "success")
flash("Welcome back!", "success")
# Redirect user to correct page
return redirect(next_url or url_for("dashboard.index"))
else:
@ -73,7 +73,7 @@ def mfa():
Session.commit()
login_user(user)
flash(f"Welcome back!", "success")
flash("Welcome back!", "success")
# Redirect user to correct page
response = make_response(redirect(next_url or url_for("dashboard.index")))

app/auth/views/oidc.py (new file, 135 lines)
View File

@ -0,0 +1,135 @@
from flask import request, session, redirect, flash, url_for
from requests_oauthlib import OAuth2Session
import requests
from app import config
from app.auth.base import auth_bp
from app.auth.views.login_utils import after_login
from app.config import (
URL,
OIDC_SCOPES,
OIDC_NAME_FIELD,
)
from app.db import Session
from app.email_utils import send_welcome_email
from app.log import LOG
from app.models import User, SocialAuth
from app.utils import sanitize_email, sanitize_next_url
# need to set explicitly redirect_uri instead of leaving the lib to pre-fill redirect_uri
# when served behind nginx, the redirect_uri is localhost... and not the real url
redirect_uri = URL + "/auth/oidc/callback"
SESSION_STATE_KEY = "oauth_state"
SESSION_NEXT_KEY = "oauth_redirect_next"
@auth_bp.route("/oidc/login")
def oidc_login():
if config.OIDC_CLIENT_ID is None or config.OIDC_CLIENT_SECRET is None:
return redirect(url_for("auth.login"))
next_url = sanitize_next_url(request.args.get("next"))
auth_url = requests.get(config.OIDC_WELL_KNOWN_URL).json()["authorization_endpoint"]
oidc = OAuth2Session(
config.OIDC_CLIENT_ID, scope=[OIDC_SCOPES], redirect_uri=redirect_uri
)
authorization_url, state = oidc.authorization_url(auth_url)
# State is used to prevent CSRF, keep this for later.
session[SESSION_STATE_KEY] = state
session[SESSION_NEXT_KEY] = next_url
return redirect(authorization_url)
@auth_bp.route("/oidc/callback")
def oidc_callback():
if SESSION_STATE_KEY not in session:
flash("Invalid state, please retry", "error")
return redirect(url_for("auth.login"))
if config.OIDC_CLIENT_ID is None or config.OIDC_CLIENT_SECRET is None:
return redirect(url_for("auth.login"))
# user clicks on cancel
if "error" in request.args:
flash("Please use another sign in method then", "warning")
return redirect("/")
oidc_configuration = requests.get(config.OIDC_WELL_KNOWN_URL).json()
user_info_url = oidc_configuration["userinfo_endpoint"]
token_url = oidc_configuration["token_endpoint"]
oidc = OAuth2Session(
config.OIDC_CLIENT_ID,
state=session[SESSION_STATE_KEY],
scope=[OIDC_SCOPES],
redirect_uri=redirect_uri,
)
oidc.fetch_token(
token_url,
client_secret=config.OIDC_CLIENT_SECRET,
authorization_response=request.url,
)
oidc_user_data = oidc.get(user_info_url)
if oidc_user_data.status_code != 200:
LOG.e(
f"cannot get oidc user data {oidc_user_data.status_code} {oidc_user_data.text}"
)
flash(
"Cannot get user data from OIDC, please use another way to login/sign up",
"error",
)
return redirect(url_for("auth.login"))
oidc_user_data = oidc_user_data.json()
email = oidc_user_data.get("email")
if not email:
LOG.e(f"cannot get email for OIDC user {oidc_user_data} {email}")
flash(
"Cannot get a valid email from OIDC, please another way to login/sign up",
"error",
)
return redirect(url_for("auth.login"))
email = sanitize_email(email)
user = User.get_by(email=email)
if not user and config.DISABLE_REGISTRATION:
flash(
"Sorry you cannot sign up via the OIDC provider. Please sign-up first with your email.",
"error",
)
return redirect(url_for("auth.register"))
elif not user:
user = create_user(email, oidc_user_data)
if not SocialAuth.get_by(user_id=user.id, social="oidc"):
SocialAuth.create(user_id=user.id, social="oidc")
Session.commit()
# The activation link contains the original page, for ex authorize page
next_url = session[SESSION_NEXT_KEY]
session[SESSION_NEXT_KEY] = None
return after_login(user, next_url)
def create_user(email, oidc_user_data):
new_user = User.create(
email=email,
name=oidc_user_data.get(OIDC_NAME_FIELD),
password="",
activated=True,
)
LOG.i(f"Created new user for login request from OIDC. New user {new_user.id}")
Session.commit()
send_welcome_email(new_user)
return new_user
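Both views fetch the provider's discovery document from `config.OIDC_WELL_KNOWN_URL` and read exactly three fields from it. A hedged example of that subset (issuer URL hypothetical):

```python
# e.g. GET https://auth.example.com/.well-known/openid-configuration
discovery_subset = {
    "authorization_endpoint": "https://auth.example.com/oauth2/authorize",  # oidc_login
    "token_endpoint": "https://auth.example.com/oauth2/token",              # oidc_callback
    "userinfo_endpoint": "https://auth.example.com/oauth2/userinfo",        # oidc_callback
}
```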

View File

@ -53,7 +53,7 @@ def recovery_route():
del session[MFA_USER_ID]
login_user(user)
flash(f"Welcome back!", "success")
flash("Welcome back!", "success")
recovery_code.used = True
recovery_code.used_at = arrow.now()

View File

@ -6,7 +6,7 @@ from wtforms import StringField, validators
from app import email_utils, config
from app.auth.base import auth_bp
from app.config import CONNECT_WITH_PROTON
from app.config import CONNECT_WITH_PROTON, CONNECT_WITH_OIDC_ICON
from app.auth.views.login_utils import get_referral
from app.config import URL, HCAPTCHA_SECRET, HCAPTCHA_SITEKEY
from app.db import Session
@ -94,9 +94,7 @@ def register():
try:
send_activation_email(user, next_url)
RegisterEvent(RegisterEvent.ActionType.success).send()
DailyMetric.get_or_create_today_metric().nb_new_web_non_proton_user += (
1
)
DailyMetric.get_or_create_today_metric().nb_new_web_non_proton_user += 1
Session.commit()
except Exception:
flash("Invalid email, are you sure the email is correct?", "error")
@ -111,6 +109,8 @@ def register():
next_url=next_url,
HCAPTCHA_SITEKEY=HCAPTCHA_SITEKEY,
connect_with_proton=CONNECT_WITH_PROTON,
connect_with_oidc=config.OIDC_CLIENT_ID is not None,
connect_with_oidc_icon=CONNECT_WITH_OIDC_ICON,
)

View File

@ -179,6 +179,7 @@ AWS_REGION = os.environ.get("AWS_REGION") or "eu-west-3"
BUCKET = os.environ.get("BUCKET")
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY")
AWS_ENDPOINT_URL = os.environ.get("AWS_ENDPOINT_URL", None)
# Paddle
try:
@ -233,7 +234,7 @@ else:
print("WARNING: Use a temp directory for GNUPGHOME", GNUPGHOME)
# Github, Google, Facebook client id and secrets
# Github, Google, Facebook, OIDC client id and secrets
GITHUB_CLIENT_ID = os.environ.get("GITHUB_CLIENT_ID")
GITHUB_CLIENT_SECRET = os.environ.get("GITHUB_CLIENT_SECRET")
@ -243,6 +244,13 @@ GOOGLE_CLIENT_SECRET = os.environ.get("GOOGLE_CLIENT_SECRET")
FACEBOOK_CLIENT_ID = os.environ.get("FACEBOOK_CLIENT_ID")
FACEBOOK_CLIENT_SECRET = os.environ.get("FACEBOOK_CLIENT_SECRET")
CONNECT_WITH_OIDC_ICON = os.environ.get("CONNECT_WITH_OIDC_ICON")
OIDC_WELL_KNOWN_URL = os.environ.get("OIDC_WELL_KNOWN_URL")
OIDC_CLIENT_ID = os.environ.get("OIDC_CLIENT_ID")
OIDC_CLIENT_SECRET = os.environ.get("OIDC_CLIENT_SECRET")
OIDC_SCOPES = os.environ.get("OIDC_SCOPES")
OIDC_NAME_FIELD = os.environ.get("OIDC_NAME_FIELD", "name")
PROTON_CLIENT_ID = os.environ.get("PROTON_CLIENT_ID")
PROTON_CLIENT_SECRET = os.environ.get("PROTON_CLIENT_SECRET")
PROTON_BASE_URL = os.environ.get(
@ -420,6 +428,11 @@ try:
except Exception:
HIBP_SCAN_INTERVAL_DAYS = 7
HIBP_API_KEYS = sl_getenv("HIBP_API_KEYS", list) or []
HIBP_MAX_ALIAS_CHECK = 10_000
HIBP_RPM = int(os.environ.get("HIBP_API_RPM", 100))
HIBP_SKIP_PARTNER_ALIAS = os.environ.get("HIBP_SKIP_PARTNER_ALIAS")
KEEP_OLD_DATA_DAYS = 30
POSTMASTER = os.environ.get("POSTMASTER")
@ -488,7 +501,34 @@ def setup_nameservers():
NAMESERVERS = setup_nameservers()
DISABLE_CREATE_CONTACTS_FOR_FREE_USERS = False
DISABLE_CREATE_CONTACTS_FOR_FREE_USERS = os.environ.get(
"DISABLE_CREATE_CONTACTS_FOR_FREE_USERS", False
)
# Expect format hits,seconds:hits,seconds...
# Example 1,10:4,60 means 1 in the last 10 secs or 4 in the last 60 secs
def getRateLimitFromConfig(
env_var: string, default: string = ""
) -> list[tuple[int, int]]:
value = os.environ.get(env_var, default)
if not value:
return []
entries = [entry for entry in value.split(":")]
limits = []
for entry in entries:
fields = entry.split(",")
limit = (int(fields[0]), int(fields[1]))
limits.append(limit)
return limits
ALIAS_CREATE_RATE_LIMIT_FREE = getRateLimitFromConfig(
"ALIAS_CREATE_RATE_LIMIT_FREE", "10,900:50,3600"
)
ALIAS_CREATE_RATE_LIMIT_PAID = getRateLimitFromConfig(
"ALIAS_CREATE_RATE_LIMIT_PAID", "50,900:200,3600"
)
PARTNER_API_TOKEN_SECRET = os.environ.get("PARTNER_API_TOKEN_SECRET") or (
FLASK_SECRET + "partnerapitoken"
)
@ -539,3 +579,11 @@ MAX_API_KEYS = int(os.environ.get("MAX_API_KEYS", 30))
UPCLOUD_USERNAME = os.environ.get("UPCLOUD_USERNAME", None)
UPCLOUD_PASSWORD = os.environ.get("UPCLOUD_PASSWORD", None)
UPCLOUD_DB_ID = os.environ.get("UPCLOUD_DB_ID", None)
STORE_TRANSACTIONAL_EMAILS = "STORE_TRANSACTIONAL_EMAILS" in os.environ
EVENT_WEBHOOK = os.environ.get("EVENT_WEBHOOK", None)
# We want it disabled by default, so only skip if defined
EVENT_WEBHOOK_SKIP_VERIFY_SSL = "EVENT_WEBHOOK_SKIP_VERIFY_SSL" in os.environ
EVENT_WEBHOOK_DISABLE = "EVENT_WEBHOOK_DISABLE" in os.environ
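The parser above reads a compact `hits,seconds` spec separated by colons. An equivalent hedged sketch with `str` annotations (the diff's version annotates with the `string` module), plus a usage check against the free-plan default:

```python
def rate_limits_from_spec(value: str) -> list[tuple[int, int]]:
    """Parse 'hits,seconds:hits,seconds' into [(hits, seconds), ...]."""
    if not value:
        return []
    limits = []
    for entry in value.split(":"):
        hits, seconds = entry.split(",")
        limits.append((int(hits), int(seconds)))
    return limits

# Free-plan default from the diff: 10 aliases per 15 min or 50 per hour.
assert rate_limits_from_spec("10,900:50,3600") == [(10, 900), (50, 3600)]
```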

View File

@ -32,4 +32,42 @@ from .views import (
delete_account,
notification,
support,
account_setting,
)
__all__ = [
"index",
"pricing",
"setting",
"custom_alias",
"subdomain",
"billing",
"alias_log",
"alias_export",
"unsubscribe",
"api_key",
"custom_domain",
"alias_contact_manager",
"enter_sudo",
"mfa_setup",
"mfa_cancel",
"fido_setup",
"coupon",
"fido_manage",
"domain_detail",
"lifetime_licence",
"directory",
"mailbox",
"mailbox_detail",
"refused_email",
"referral",
"contact_detail",
"setup_done",
"batch_import",
"alias_transfer",
"app",
"delete_account",
"notification",
"support",
"account_setting",
]

View File

@ -0,0 +1,242 @@
import arrow
from flask import (
render_template,
request,
redirect,
url_for,
flash,
)
from flask_login import login_required, current_user
from app import email_utils
from app.config import (
URL,
FIRST_ALIAS_DOMAIN,
ALIAS_RANDOM_SUFFIX_LENGTH,
CONNECT_WITH_PROTON,
)
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.dashboard.views.mailbox_detail import ChangeEmailForm
from app.db import Session
from app.email_utils import (
email_can_be_used_as_mailbox,
personal_email_already_used,
)
from app.extensions import limiter
from app.jobs.export_user_data_job import ExportUserDataJob
from app.log import LOG
from app.models import (
BlockBehaviourEnum,
PlanEnum,
ResetPasswordCode,
EmailChange,
User,
Alias,
AliasGeneratorEnum,
SenderFormatEnum,
UnsubscribeBehaviourEnum,
)
from app.proton.utils import perform_proton_account_unlink
from app.utils import (
random_string,
CSRFValidationForm,
canonicalize_email,
)
@dashboard_bp.route("/account_setting", methods=["GET", "POST"])
@login_required
@sudo_required
@limiter.limit("5/minute", methods=["POST"])
def account_setting():
change_email_form = ChangeEmailForm()
csrf_form = CSRFValidationForm()
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
pending_email = email_change.new_email
else:
pending_email = None
if request.method == "POST":
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
if request.form.get("form-name") == "update-email":
if change_email_form.validate():
# whether user can proceed with the email update
new_email_valid = True
new_email = canonicalize_email(change_email_form.email.data)
if new_email != current_user.email and not pending_email:
# check if this email is not already used
if personal_email_already_used(new_email) or Alias.get_by(
email=new_email
):
flash(f"Email {new_email} already used", "error")
new_email_valid = False
elif not email_can_be_used_as_mailbox(new_email):
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
# a pending email change with the same email exists from another user
elif EmailChange.get_by(new_email=new_email):
other_email_change: EmailChange = EmailChange.get_by(
new_email=new_email
)
LOG.w(
"Another user has a pending %s with the same email address. Current user:%s",
other_email_change,
current_user,
)
if other_email_change.is_expired():
LOG.d(
"delete the expired email change %s", other_email_change
)
EmailChange.delete(other_email_change.id)
Session.commit()
else:
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
if new_email_valid:
email_change = EmailChange.create(
user_id=current_user.id,
code=random_string(
60
), # todo: make sure the code is unique
new_email=new_email,
)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash(
"A confirmation email is on the way, please check your inbox",
"success",
)
return redirect(url_for("dashboard.account_setting"))
elif request.form.get("form-name") == "change-password":
flash(
"You are going to receive an email containing instructions to change your password",
"success",
)
send_reset_password_email(current_user)
return redirect(url_for("dashboard.account_setting"))
elif request.form.get("form-name") == "send-full-user-report":
if ExportUserDataJob(current_user).store_job_in_db():
flash(
"You will receive your SimpleLogin data via email shortly",
"success",
)
else:
flash("An export of your data is currently in progress", "error")
partner_sub = None
partner_name = None
return render_template(
"dashboard/account_setting.html",
csrf_form=csrf_form,
PlanEnum=PlanEnum,
SenderFormatEnum=SenderFormatEnum,
BlockBehaviourEnum=BlockBehaviourEnum,
change_email_form=change_email_form,
pending_email=pending_email,
AliasGeneratorEnum=AliasGeneratorEnum,
UnsubscribeBehaviourEnum=UnsubscribeBehaviourEnum,
partner_sub=partner_sub,
partner_name=partner_name,
FIRST_ALIAS_DOMAIN=FIRST_ALIAS_DOMAIN,
ALIAS_RAND_SUFFIX_LENGTH=ALIAS_RANDOM_SUFFIX_LENGTH,
connect_with_proton=CONNECT_WITH_PROTON,
)
def send_reset_password_email(user):
"""
generate a new ResetPasswordCode and send it over email to user
"""
# the activation code is valid for 1h
reset_password_code = ResetPasswordCode.create(
user_id=user.id, code=random_string(60)
)
Session.commit()
reset_password_link = f"{URL}/auth/reset_password?code={reset_password_code.code}"
email_utils.send_reset_password_email(user.email, reset_password_link)
def send_change_email_confirmation(user: User, email_change: EmailChange):
"""
send confirmation email to the new email address
"""
link = f"{URL}/auth/change_email?code={email_change.code}"
email_utils.send_change_email(email_change.new_email, user.email, link)
@dashboard_bp.route("/resend_email_change", methods=["GET", "POST"])
@limiter.limit("5/hour")
@login_required
@sudo_required
def resend_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
# extend email change expiration
email_change.expired = arrow.now().shift(hours=12)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash("A confirmation email is on the way, please check your inbox", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/cancel_email_change", methods=["GET", "POST"])
@login_required
@sudo_required
def cancel_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
EmailChange.delete(email_change.id)
Session.commit()
flash("Your email change is cancelled", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/unlink_proton_account", methods=["POST"])
@login_required
@sudo_required
def unlink_proton_account():
csrf_form = CSRFValidationForm()
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
perform_proton_account_unlink(current_user)
flash("Your Proton account has been unlinked", "success")
return redirect(url_for("dashboard.setting"))
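These routes stack `@sudo_required` on top of `@login_required`. The decorator itself is not part of this compare; a hedged sketch of what it is assumed to do, based on the `_SUDO_GAP` constant in the enter_sudo diff and the `session["sudo_time"]` writes in the FIDO diff:

```python
from functools import wraps
from time import time

from flask import redirect, request, session, url_for

_SUDO_GAP = 120  # reduced from 900 s elsewhere in this compare

def sudo_required(f):
    @wraps(f)
    def wrapper(*args, **kwargs):
        # Re-prompt for the password once sudo mode is older than _SUDO_GAP.
        if time() - session.get("sudo_time", 0) > _SUDO_GAP:
            return redirect(url_for("dashboard.enter_sudo", next=request.path))
        return f(*args, **kwargs)
    return wrapper
```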

View File

@ -51,14 +51,6 @@ def email_validator():
return _check
def user_can_create_contacts(user: User) -> bool:
if user.is_premium():
return True
if user.flags & User.FLAG_FREE_DISABLE_CREATE_ALIAS == 0:
return True
return not config.DISABLE_CREATE_CONTACTS_FOR_FREE_USERS
def create_contact(user: User, alias: Alias, contact_address: str) -> Contact:
"""
Create a contact for a user. Can be restricted for new free users by enabling DISABLE_CREATE_CONTACTS_FOR_FREE_USERS.
@ -82,7 +74,7 @@ def create_contact(user: User, alias: Alias, contact_address: str) -> Contact:
if contact:
raise ErrContactAlreadyExists(contact)
if not user_can_create_contacts(user):
if not user.can_create_contacts():
raise ErrContactErrorUpgradeNeeded()
contact = Contact.create(
@ -327,6 +319,6 @@ def alias_contact_manager(alias_id):
last_page=last_page,
query=query,
nb_contact=nb_contact,
can_create_contacts=user_can_create_contacts(current_user),
can_create_contacts=current_user.can_create_contacts(),
csrf_form=csrf_form,
)

View File

@ -1,9 +1,13 @@
from app.dashboard.base import dashboard_bp
from flask_login import login_required, current_user
from app.alias_utils import alias_export_csv
from app.dashboard.views.enter_sudo import sudo_required
from app.extensions import limiter
@dashboard_bp.route("/alias_export", methods=["GET"])
@login_required
@sudo_required
@limiter.limit("2/minute")
def alias_export_route():
return alias_export_csv(current_user)

View File

@ -87,6 +87,6 @@ def get_alias_log(alias: Alias, page_id=0) -> [AliasLog]:
contact=contact,
)
logs.append(al)
logs = sorted(logs, key=lambda l: l.when, reverse=True)
logs = sorted(logs, key=lambda log: log.when, reverse=True)
return logs

View File

@ -1,14 +1,9 @@
from app.db import Session
"""
List of apps that user has used via the "Sign in with SimpleLogin"
"""
from flask import render_template, request, flash, redirect
from flask_login import login_required, current_user
from sqlalchemy.orm import joinedload
from app.dashboard.base import dashboard_bp
from app.db import Session
from app.models import (
ClientUser,
)
@ -17,6 +12,10 @@ from app.models import (
@dashboard_bp.route("/app", methods=["GET", "POST"])
@login_required
def app_route():
"""
List of apps that user has used via the "Sign in with SimpleLogin"
"""
client_users = (
ClientUser.filter_by(user_id=current_user.id)
.options(joinedload(ClientUser.client))

View File

@ -5,7 +5,9 @@ from flask_login import login_required, current_user
from app import s3
from app.config import JOB_BATCH_IMPORT
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.db import Session
from app.extensions import limiter
from app.log import LOG
from app.models import File, BatchImport, Job
from app.utils import random_string, CSRFValidationForm
@ -13,6 +15,8 @@ from app.utils import random_string, CSRFValidationForm
@dashboard_bp.route("/batch_import", methods=["GET", "POST"])
@login_required
@sudo_required
@limiter.limit("10/minute", methods=["POST"])
def batch_import_route():
# only for users who have custom domains
if not current_user.verified_custom_domains():
@ -37,7 +41,7 @@ def batch_import_route():
return redirect(request.url)
if len(batch_imports) > 10:
flash(
"You have too many imports already. Wait until some get cleaned up",
"You have too many imports already. Please wait until some get cleaned up",
"error",
)
return render_template(

View File

@ -100,7 +100,7 @@ def coupon_route():
commit=True,
)
flash(
f"Your account has been upgraded to Premium, thanks for your support!",
"Your account has been upgraded to Premium, thanks for your support!",
"success",
)

View File

@ -24,6 +24,7 @@ from app.models import (
AliasMailbox,
DomainDeletedAlias,
)
from app.utils import CSRFValidationForm
@dashboard_bp.route("/custom_alias", methods=["GET", "POST"])
@ -48,9 +49,13 @@ def custom_alias():
at_least_a_premium_domain = True
break
csrf_form = CSRFValidationForm()
mailboxes = current_user.mailboxes()
if request.method == "POST":
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(request.url)
alias_prefix = request.form.get("prefix").strip().lower().replace(" ", "")
signed_alias_suffix = request.form.get("signed-alias-suffix")
mailbox_ids = request.form.getlist("mailboxes")
@ -164,4 +169,5 @@ def custom_alias():
alias_suffixes=alias_suffixes,
at_least_a_premium_domain=at_least_a_premium_domain,
mailboxes=mailboxes,
csrf_form=csrf_form,
)

View File

@ -67,7 +67,7 @@ def directory():
if request.method == "POST":
if request.form.get("form-name") == "delete":
if not delete_dir_form.validate():
flash(f"Invalid request", "warning")
flash("Invalid request", "warning")
return redirect(url_for("dashboard.directory"))
dir_obj = Directory.get(delete_dir_form.directory_id.data)
@ -87,7 +87,7 @@ def directory():
if request.form.get("form-name") == "toggle-directory":
if not toggle_dir_form.validate():
flash(f"Invalid request", "warning")
flash("Invalid request", "warning")
return redirect(url_for("dashboard.directory"))
dir_id = toggle_dir_form.directory_id.data
dir_obj = Directory.get(dir_id)
@ -109,7 +109,7 @@ def directory():
elif request.form.get("form-name") == "update":
if not update_dir_form.validate():
flash(f"Invalid request", "warning")
flash("Invalid request", "warning")
return redirect(url_for("dashboard.directory"))
dir_id = update_dir_form.directory_id.data
dir_obj = Directory.get(dir_id)

View File

@ -6,15 +6,15 @@ from flask_login import login_required, current_user
from flask_wtf import FlaskForm
from wtforms import PasswordField, validators
from app.config import CONNECT_WITH_PROTON
from app.config import CONNECT_WITH_PROTON, OIDC_CLIENT_ID, CONNECT_WITH_OIDC_ICON
from app.dashboard.base import dashboard_bp
from app.extensions import limiter
from app.log import LOG
from app.models import PartnerUser
from app.models import PartnerUser, SocialAuth
from app.proton.utils import get_proton_partner
from app.utils import sanitize_next_url
_SUDO_GAP = 900
_SUDO_GAP = 120
class LoginForm(FlaskForm):
@ -51,11 +51,19 @@ def enter_sudo():
if not partner_user or partner_user.partner_id != get_proton_partner().id:
proton_enabled = False
oidc_enabled = OIDC_CLIENT_ID is not None
if oidc_enabled:
oidc_enabled = (
SocialAuth.get_by(user_id=current_user.id, social="oidc") is not None
)
return render_template(
"dashboard/enter_sudo.html",
password_check_form=password_check_form,
next=request.args.get("next"),
connect_with_proton=proton_enabled,
connect_with_oidc=oidc_enabled,
connect_with_oidc_icon=CONNECT_WITH_OIDC_ICON,
)

View File

@ -52,12 +52,13 @@ def get_stats(user: User) -> Stats:
@dashboard_bp.route("/", methods=["GET", "POST"])
@login_required
@limiter.limit(
ALIAS_LIMIT,
methods=["POST"],
exempt_when=lambda: request.form.get("form-name") != "create-random-email",
)
@login_required
@limiter.limit("10/minute", methods=["GET"], key_func=lambda: current_user.id)
@parallel_limiter.lock(
name="alias_creation",
only_when=lambda: request.form.get("form-name") == "create-random-email",
@ -140,12 +141,12 @@ def index():
)
if request.form.get("form-name") == "delete-alias":
LOG.d("delete alias %s", alias)
LOG.i(f"User {current_user} requested deletion of alias {alias}")
email = alias.email
alias_utils.delete_alias(alias, current_user)
flash(f"Alias {email} has been deleted", "success")
elif request.form.get("form-name") == "disable-alias":
alias.enabled = False
alias_utils.change_alias_status(alias, enabled=False)
Session.commit()
flash(f"Alias {alias.email} has been disabled", "success")

View File

@ -11,9 +11,11 @@ from wtforms.fields.html5 import EmailField
from app.config import ENFORCE_SPF, MAILBOX_SECRET
from app.config import URL
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.db import Session
from app.email_utils import email_can_be_used_as_mailbox
from app.email_utils import mailbox_already_used, render, send_email
from app.extensions import limiter
from app.log import LOG
from app.models import Alias, AuthorizedAddress
from app.models import Mailbox
@ -29,6 +31,8 @@ class ChangeEmailForm(FlaskForm):
@dashboard_bp.route("/mailbox/<int:mailbox_id>/", methods=["GET", "POST"])
@login_required
@sudo_required
@limiter.limit("20/minute", methods=["POST"])
def mailbox_detail_route(mailbox_id):
mailbox: Mailbox = Mailbox.get(mailbox_id)
if not mailbox or mailbox.user_id != current_user.id:
@ -179,8 +183,15 @@ def mailbox_detail_route(mailbox_id):
elif request.form.get("form-name") == "toggle-pgp":
if request.form.get("pgp-enabled") == "on":
mailbox.disable_pgp = False
flash(f"PGP is enabled on {mailbox.email}", "success")
if mailbox.is_proton():
mailbox.disable_pgp = True
flash(
"Enabling PGP for a Proton Mail mailbox is redundant and does not add any security benefit",
"info",
)
else:
mailbox.disable_pgp = False
flash(f"PGP is enabled on {mailbox.email}", "info")
else:
mailbox.disable_pgp = True
flash(f"PGP is disabled on {mailbox.email}", "info")

View File

@ -13,34 +13,24 @@ from flask_login import login_required, current_user
from flask_wtf import FlaskForm
from flask_wtf.file import FileField
from wtforms import StringField, validators
from wtforms.fields.html5 import EmailField
from app import s3, email_utils
from app import s3
from app.config import (
URL,
FIRST_ALIAS_DOMAIN,
ALIAS_RANDOM_SUFFIX_LENGTH,
CONNECT_WITH_PROTON,
)
from app.dashboard.base import dashboard_bp
from app.db import Session
from app.email_utils import (
email_can_be_used_as_mailbox,
personal_email_already_used,
)
from app.errors import ProtonPartnerNotSetUp
from app.extensions import limiter
from app.image_validation import detect_image_format, ImageFormat
from app.jobs.export_user_data_job import ExportUserDataJob
from app.log import LOG
from app.models import (
BlockBehaviourEnum,
PlanEnum,
File,
ResetPasswordCode,
EmailChange,
User,
Alias,
CustomDomain,
AliasGeneratorEnum,
AliasSuffixEnum,
@ -53,11 +43,10 @@ from app.models import (
PartnerSubscription,
UnsubscribeBehaviourEnum,
)
from app.proton.utils import get_proton_partner, perform_proton_account_unlink
from app.proton.utils import get_proton_partner
from app.utils import (
random_string,
CSRFValidationForm,
canonicalize_email,
)
@ -66,12 +55,6 @@ class SettingForm(FlaskForm):
profile_picture = FileField("Profile Picture")
class ChangeEmailForm(FlaskForm):
email = EmailField(
"email", validators=[validators.DataRequired(), validators.Email()]
)
class PromoCodeForm(FlaskForm):
code = StringField("Name", validators=[validators.DataRequired()])
@ -109,7 +92,6 @@ def get_partner_subscription_and_name(
def setting():
form = SettingForm()
promo_form = PromoCodeForm()
change_email_form = ChangeEmailForm()
csrf_form = CSRFValidationForm()
email_change = EmailChange.get_by(user_id=current_user.id)
@ -122,64 +104,7 @@ def setting():
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
if request.form.get("form-name") == "update-email":
if change_email_form.validate():
# whether user can proceed with the email update
new_email_valid = True
new_email = canonicalize_email(change_email_form.email.data)
if new_email != current_user.email and not pending_email:
# check if this email is not already used
if personal_email_already_used(new_email) or Alias.get_by(
email=new_email
):
flash(f"Email {new_email} already used", "error")
new_email_valid = False
elif not email_can_be_used_as_mailbox(new_email):
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
# a pending email change with the same email exists from another user
elif EmailChange.get_by(new_email=new_email):
other_email_change: EmailChange = EmailChange.get_by(
new_email=new_email
)
LOG.w(
"Another user has a pending %s with the same email address. Current user:%s",
other_email_change,
current_user,
)
if other_email_change.is_expired():
LOG.d(
"delete the expired email change %s", other_email_change
)
EmailChange.delete(other_email_change.id)
Session.commit()
else:
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
if new_email_valid:
email_change = EmailChange.create(
user_id=current_user.id,
code=random_string(
60
), # todo: make sure the code is unique
new_email=new_email,
)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash(
"A confirmation email is on the way, please check your inbox",
"success",
)
return redirect(url_for("dashboard.setting"))
if request.form.get("form-name") == "update-profile":
if form.validate():
profile_updated = False
@ -223,15 +148,6 @@ def setting():
if profile_updated:
flash("Your profile has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-password":
flash(
"You are going to receive an email containing instructions to change your password",
"success",
)
send_reset_password_email(current_user)
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "notification-preference":
choose = request.form.get("notification")
if choose == "on":
@ -241,7 +157,6 @@ def setting():
Session.commit()
flash("Your notification preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-alias-generator":
scheme = int(request.form.get("alias-generator-scheme"))
if AliasGeneratorEnum.has_value(scheme):
@ -249,7 +164,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-random-alias-default-domain":
default_domain = request.form.get("random-alias-default-domain")
@ -288,7 +202,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "random-alias-suffix":
scheme = int(request.form.get("random-alias-suffix-generator"))
if AliasSuffixEnum.has_value(scheme):
@ -296,7 +209,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-sender-format":
sender_format = int(request.form.get("sender-format"))
if SenderFormatEnum.has_value(sender_format):
@ -306,7 +218,6 @@ def setting():
flash("Your sender format preference has been updated", "success")
Session.commit()
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "replace-ra":
choose = request.form.get("replace-ra")
if choose == "on":
@ -316,7 +227,21 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "enable_data_breach_check":
if not current_user.is_premium():
flash("Only premium plan can enable data breach monitoring", "warning")
return redirect(url_for("dashboard.setting"))
choose = request.form.get("enable_data_breach_check")
if choose == "on":
LOG.i("User {current_user} has enabled data breach monitoring")
current_user.enable_data_breach_check = True
flash("Data breach monitoring is enabled", "success")
else:
LOG.i("User {current_user} has disabled data breach monitoring")
current_user.enable_data_breach_check = False
flash("Data breach monitoring is disabled", "info")
Session.commit()
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "sender-in-ra":
choose = request.form.get("enable")
if choose == "on":
@ -326,7 +251,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "expand-alias-info":
choose = request.form.get("enable")
if choose == "on":
@ -388,14 +312,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "send-full-user-report":
if ExportUserDataJob(current_user).store_job_in_db():
flash(
"You will receive your SimpleLogin data via email shortly",
"success",
)
else:
flash("An export of your data is currently in progress", "error")
manual_sub = ManualSubscription.get_by(user_id=current_user.id)
apple_sub = AppleSubscription.get_by(user_id=current_user.id)
@ -418,7 +334,6 @@ def setting():
SenderFormatEnum=SenderFormatEnum,
BlockBehaviourEnum=BlockBehaviourEnum,
promo_form=promo_form,
change_email_form=change_email_form,
pending_email=pending_email,
AliasGeneratorEnum=AliasGeneratorEnum,
UnsubscribeBehaviourEnum=UnsubscribeBehaviourEnum,
@ -433,85 +348,3 @@ def setting():
connect_with_proton=CONNECT_WITH_PROTON,
proton_linked_account=proton_linked_account,
)
def send_reset_password_email(user):
"""
generate a new ResetPasswordCode and send it over email to user
"""
# the activation code is valid for 1h
reset_password_code = ResetPasswordCode.create(
user_id=user.id, code=random_string(60)
)
Session.commit()
reset_password_link = f"{URL}/auth/reset_password?code={reset_password_code.code}"
email_utils.send_reset_password_email(user.email, reset_password_link)
def send_change_email_confirmation(user: User, email_change: EmailChange):
"""
send confirmation email to the new email address
"""
link = f"{URL}/auth/change_email?code={email_change.code}"
email_utils.send_change_email(email_change.new_email, user.email, link)
@dashboard_bp.route("/resend_email_change", methods=["GET", "POST"])
@limiter.limit("5/hour")
@login_required
def resend_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
# extend email change expiration
email_change.expired = arrow.now().shift(hours=12)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash("A confirmation email is on the way, please check your inbox", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/cancel_email_change", methods=["GET", "POST"])
@login_required
def cancel_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
EmailChange.delete(email_change.id)
Session.commit()
flash("Your email change is cancelled", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/unlink_proton_account", methods=["POST"])
@login_required
def unlink_proton_account():
csrf_form = CSRFValidationForm()
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
perform_proton_account_unlink(current_user)
flash("Your Proton account has been unlinked", "success")
return redirect(url_for("dashboard.setting"))

View File

@ -8,6 +8,7 @@ from app.db import Session
from flask import redirect, url_for, flash, request, render_template
from flask_login import login_required, current_user
from app import alias_utils
from app.dashboard.base import dashboard_bp
from app.handler.unsubscribe_encoder import UnsubscribeAction
from app.handler.unsubscribe_handler import UnsubscribeHandler
@ -31,7 +32,7 @@ def unsubscribe(alias_id):
# automatic unsubscribe, according to https://tools.ietf.org/html/rfc8058
if request.method == "POST":
alias.enabled = False
alias_utils.change_alias_status(alias, False)
flash(f"Alias {alias.email} has been blocked", "success")
Session.commit()
@ -75,12 +76,11 @@ def block_contact(contact_id):
@dashboard_bp.route("/unsubscribe/encoded/<encoded_request>", methods=["GET"])
@login_required
def encoded_unsubscribe(encoded_request: str):
unsub_data = UnsubscribeHandler().handle_unsubscribe_from_request(
current_user, encoded_request
)
if not unsub_data:
flash(f"Invalid unsubscribe request", "error")
flash("Invalid unsubscribe request", "error")
return redirect(url_for("dashboard.index"))
if unsub_data.action == UnsubscribeAction.DisableAlias:
alias = Alias.get(unsub_data.data)
@ -97,14 +97,14 @@ def encoded_unsubscribe(encoded_request: str):
)
)
if unsub_data.action == UnsubscribeAction.UnsubscribeNewsletter:
flash(f"You've unsubscribed from the newsletter", "success")
flash("You've unsubscribed from the newsletter", "success")
return redirect(
url_for(
"dashboard.index",
)
)
if unsub_data.action == UnsubscribeAction.OriginalUnsubscribeMailto:
flash(f"The original unsubscribe request has been forwarded", "success")
flash("The original unsubscribe request has been forwarded", "success")
return redirect(
url_for(
"dashboard.index",

View File

@ -1 +1,3 @@
from .views import index, new_client, client_detail
__all__ = ["index", "new_client", "client_detail"]

View File

@ -87,7 +87,7 @@ def client_detail(client_id):
)
flash(
f"Thanks for submitting, we are informed and will come back to you asap!",
"Thanks for submitting, we are informed and will come back to you asap!",
"success",
)

View File

@ -1 +1,3 @@
from .views import index
__all__ = ["index"]

View File

@ -21,6 +21,7 @@ LIST_UNSUBSCRIBE = "List-Unsubscribe"
LIST_UNSUBSCRIBE_POST = "List-Unsubscribe-Post"
RETURN_PATH = "Return-Path"
AUTHENTICATION_RESULTS = "Authentication-Results"
SL_QUEUE_ID = "X-SL-Queue-Id"
# headers used to DKIM sign in order of preference
DKIM_HEADERS = [

View File

@ -93,7 +93,7 @@ def send_welcome_email(user):
send_email(
comm_email,
f"Welcome to SimpleLogin",
"Welcome to SimpleLogin",
render("com/welcome.txt", user=user, alias=alias),
render("com/welcome.html", user=user, alias=alias),
unsubscribe_link,
@ -104,7 +104,7 @@ def send_welcome_email(user):
def send_trial_end_soon_email(user):
send_email(
user.email,
f"Your trial will end soon",
"Your trial will end soon",
render("transactional/trial-end.txt.jinja2", user=user),
render("transactional/trial-end.html", user=user),
ignore_smtp_error=True,
@ -114,7 +114,7 @@ def send_trial_end_soon_email(user):
def send_activation_email(email, activation_link):
send_email(
email,
f"Just one more step to join SimpleLogin",
"Just one more step to join SimpleLogin",
render(
"transactional/activation.txt",
activation_link=activation_link,
@ -494,9 +494,10 @@ def delete_header(msg: Message, header: str):
def sanitize_header(msg: Message, header: str):
"""remove trailing space and remove linebreak from a header"""
header_lowercase = header.lower()
for i in reversed(range(len(msg._headers))):
header_name = msg._headers[i][0].lower()
if header_name == header.lower():
if header_name == header_lowercase:
# msg._headers[i] is a tuple like ('From', 'hey@google.com')
if msg._headers[i][1]:
msg._headers[i] = (
@ -583,6 +584,26 @@ def email_can_be_used_as_mailbox(email_address: str) -> bool:
LOG.d("MX Domain %s %s is invalid mailbox domain", mx_domain, domain)
return False
existing_user = User.get_by(email=email_address)
if existing_user and existing_user.disabled:
LOG.d(
f"User {existing_user} is disabled. {email_address} cannot be used for other mailbox"
)
return False
for existing_user in (
User.query()
.join(Mailbox, User.id == Mailbox.user_id)
.filter(Mailbox.email == email_address)
.group_by(User.id)
.all()
):
if existing_user.disabled:
LOG.d(
f"User {existing_user} is disabled and has a mailbox with {email_address}. Id cannot be used for other mailbox"
)
return False
return True
@ -768,7 +789,7 @@ def get_header_unicode(header: Union[str, Header]) -> str:
ret = ""
for to_decoded_str, charset in decode_header(header):
if charset is None:
if type(to_decoded_str) is bytes:
if isinstance(to_decoded_str, bytes):
decoded_str = to_decoded_str.decode()
else:
decoded_str = to_decoded_str
@ -805,13 +826,13 @@ def to_bytes(msg: Message):
for generator_policy in [None, policy.SMTP, policy.SMTPUTF8]:
try:
return msg.as_bytes(policy=generator_policy)
except:
except Exception:
LOG.w("as_bytes() fails with %s policy", policy, exc_info=True)
msg_string = msg.as_string()
try:
return msg_string.encode()
except:
except Exception:
LOG.w("as_string().encode() fails", exc_info=True)
return msg_string.encode(errors="replace")
@ -906,7 +927,7 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:
if content_type == "text/plain":
encoding = get_encoding(msg)
payload = msg.get_payload()
if type(payload) is str:
if isinstance(payload, str):
clone_msg = copy(msg)
new_payload = f"""{text_header}
------------------------------
@ -916,7 +937,7 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:
elif content_type == "text/html":
encoding = get_encoding(msg)
payload = msg.get_payload()
if type(payload) is str:
if isinstance(payload, str):
new_payload = f"""<table width="100%" style="width: 100%; -premailer-width: 100%; -premailer-cellpadding: 0;
-premailer-cellspacing: 0; margin: 0; padding: 0;">
<tr>
@ -972,7 +993,7 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:
def replace(msg: Union[Message, str], old, new) -> Union[Message, str]:
if type(msg) is str:
if isinstance(msg, str):
msg = msg.replace(old, new)
return msg
@ -995,7 +1016,7 @@ def replace(msg: Union[Message, str], old, new) -> Union[Message, str]:
if content_type in ("text/plain", "text/html"):
encoding = get_encoding(msg)
payload = msg.get_payload()
if type(payload) is str:
if isinstance(payload, str):
if encoding == EmailEncoding.QUOTED:
LOG.d("handle quoted-printable replace %s -> %s", old, new)
# first decode the payload
@ -1383,7 +1404,7 @@ def generate_verp_email(
# Time is in minutes granularity and start counting on 2022-01-01 to reduce bytes to represent time
data = [
verp_type.value,
object_id,
object_id or 0,
int((time.time() - VERP_TIME_START) / 60),
]
json_payload = json.dumps(data).encode("utf-8")

0
app/events/__init__.py Normal file
View File

View File

@ -0,0 +1,66 @@
from abc import ABC, abstractmethod
from app import config
from app.db import Session
from app.errors import ProtonPartnerNotSetUp
from app.events.generated import event_pb2
from app.models import User, PartnerUser, SyncEvent
from app.proton.utils import get_proton_partner
from typing import Optional
NOTIFICATION_CHANNEL = "simplelogin_sync_events"
class Dispatcher(ABC):
@abstractmethod
def send(self, event: bytes):
pass
class PostgresDispatcher(Dispatcher):
def send(self, event: bytes):
instance = SyncEvent.create(content=event, flush=True)
Session.execute(f"NOTIFY {NOTIFICATION_CHANNEL}, '{instance.id}';")
@staticmethod
def get():
return PostgresDispatcher()
class EventDispatcher:
@staticmethod
def send_event(
user: User,
content: event_pb2.EventContent,
dispatcher: Dispatcher = PostgresDispatcher.get(),
skip_if_webhook_missing: bool = True,
):
if config.EVENT_WEBHOOK_DISABLE:
return
if not config.EVENT_WEBHOOK and skip_if_webhook_missing:
return
partner_user = EventDispatcher.__partner_user(user.id)
if not partner_user:
return
event = event_pb2.Event(
user_id=user.id,
external_user_id=partner_user.external_user_id,
partner_id=partner_user.partner_id,
content=content,
)
serialized = event.SerializeToString()
dispatcher.send(serialized)
@staticmethod
def __partner_user(user_id: int) -> Optional[PartnerUser]:
# Check if the current user has a partner_id
try:
proton_partner_id = get_proton_partner().id
except ProtonPartnerNotSetUp:
return None
# It has. Retrieve the information for the PartnerUser
return PartnerUser.get_by(user_id=user_id, partner_id=proton_partner_id)
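Since send_event accepts any Dispatcher, the Postgres NOTIFY path can be swapped out. A minimal sketch (hypothetical test code, not part of this changeset) that captures serialized events in memory:

from app.events.event_dispatcher import Dispatcher, EventDispatcher
from app.events.generated.event_pb2 import EventContent, UserDeleted

class MemoryDispatcher(Dispatcher):
    # Collects raw protobuf payloads instead of emitting a Postgres NOTIFY
    def __init__(self):
        self.events: list[bytes] = []

    def send(self, event: bytes):
        self.events.append(event)

dispatcher = MemoryDispatcher()
# Assumes `user` is a User linked to the Proton partner; otherwise
# send_event returns without dispatching anything.
EventDispatcher.send_event(
    user,
    EventContent(user_deleted=UserDeleted()),
    dispatcher=dispatcher,
    skip_if_webhook_missing=False,
)
assert len(dispatcher.events) == 1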

View File

@ -0,0 +1,38 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: event.proto
# Protobuf Python Version: 5.26.1
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x0b\x65vent.proto\x12\x12simplelogin_events\"\'\n\x0eUserPlanChange\x12\x15\n\rplan_end_time\x18\x01 \x01(\r\"\r\n\x0bUserDeleted\"Z\n\x0c\x41liasCreated\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\x12\x12\n\nalias_note\x18\x03 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x04 \x01(\x08\"K\n\x11\x41liasStatusChange\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x03 \x01(\x08\"5\n\x0c\x41liasDeleted\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\"\xce\x02\n\x0c\x45ventContent\x12>\n\x10user_plan_change\x18\x01 \x01(\x0b\x32\".simplelogin_events.UserPlanChangeH\x00\x12\x37\n\x0cuser_deleted\x18\x02 \x01(\x0b\x32\x1f.simplelogin_events.UserDeletedH\x00\x12\x39\n\ralias_created\x18\x03 \x01(\x0b\x32 .simplelogin_events.AliasCreatedH\x00\x12\x44\n\x13\x61lias_status_change\x18\x04 \x01(\x0b\x32%.simplelogin_events.AliasStatusChangeH\x00\x12\x39\n\ralias_deleted\x18\x05 \x01(\x0b\x32 .simplelogin_events.AliasDeletedH\x00\x42\t\n\x07\x63ontent\"y\n\x05\x45vent\x12\x0f\n\x07user_id\x18\x01 \x01(\r\x12\x18\n\x10\x65xternal_user_id\x18\x02 \x01(\t\x12\x12\n\npartner_id\x18\x03 \x01(\r\x12\x31\n\x07\x63ontent\x18\x04 \x01(\x0b\x32 .simplelogin_events.EventContentb\x06proto3')
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'event_pb2', _globals)
if not _descriptor._USE_C_DESCRIPTORS:
DESCRIPTOR._loaded_options = None
_globals['_USERPLANCHANGE']._serialized_start=35
_globals['_USERPLANCHANGE']._serialized_end=74
_globals['_USERDELETED']._serialized_start=76
_globals['_USERDELETED']._serialized_end=89
_globals['_ALIASCREATED']._serialized_start=91
_globals['_ALIASCREATED']._serialized_end=181
_globals['_ALIASSTATUSCHANGE']._serialized_start=183
_globals['_ALIASSTATUSCHANGE']._serialized_end=258
_globals['_ALIASDELETED']._serialized_start=260
_globals['_ALIASDELETED']._serialized_end=313
_globals['_EVENTCONTENT']._serialized_start=316
_globals['_EVENTCONTENT']._serialized_end=650
_globals['_EVENT']._serialized_start=652
_globals['_EVENT']._serialized_end=773
# @@protoc_insertion_point(module_scope)

View File

@ -0,0 +1,71 @@
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from typing import ClassVar as _ClassVar, Mapping as _Mapping, Optional as _Optional, Union as _Union
DESCRIPTOR: _descriptor.FileDescriptor
class UserPlanChange(_message.Message):
__slots__ = ("plan_end_time",)
PLAN_END_TIME_FIELD_NUMBER: _ClassVar[int]
plan_end_time: int
def __init__(self, plan_end_time: _Optional[int] = ...) -> None: ...
class UserDeleted(_message.Message):
__slots__ = ()
def __init__(self) -> None: ...
class AliasCreated(_message.Message):
__slots__ = ("alias_id", "alias_email", "alias_note", "enabled")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int]
ALIAS_NOTE_FIELD_NUMBER: _ClassVar[int]
ENABLED_FIELD_NUMBER: _ClassVar[int]
alias_id: int
alias_email: str
alias_note: str
enabled: bool
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ..., alias_note: _Optional[str] = ..., enabled: bool = ...) -> None: ...
class AliasStatusChange(_message.Message):
__slots__ = ("alias_id", "alias_email", "enabled")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int]
ENABLED_FIELD_NUMBER: _ClassVar[int]
alias_id: int
alias_email: str
enabled: bool
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ..., enabled: bool = ...) -> None: ...
class AliasDeleted(_message.Message):
__slots__ = ("alias_id", "alias_email")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int]
alias_id: int
alias_email: str
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ...) -> None: ...
class EventContent(_message.Message):
__slots__ = ("user_plan_change", "user_deleted", "alias_created", "alias_status_change", "alias_deleted")
USER_PLAN_CHANGE_FIELD_NUMBER: _ClassVar[int]
USER_DELETED_FIELD_NUMBER: _ClassVar[int]
ALIAS_CREATED_FIELD_NUMBER: _ClassVar[int]
ALIAS_STATUS_CHANGE_FIELD_NUMBER: _ClassVar[int]
ALIAS_DELETED_FIELD_NUMBER: _ClassVar[int]
user_plan_change: UserPlanChange
user_deleted: UserDeleted
alias_created: AliasCreated
alias_status_change: AliasStatusChange
alias_deleted: AliasDeleted
def __init__(self, user_plan_change: _Optional[_Union[UserPlanChange, _Mapping]] = ..., user_deleted: _Optional[_Union[UserDeleted, _Mapping]] = ..., alias_created: _Optional[_Union[AliasCreated, _Mapping]] = ..., alias_status_change: _Optional[_Union[AliasStatusChange, _Mapping]] = ..., alias_deleted: _Optional[_Union[AliasDeleted, _Mapping]] = ...) -> None: ...
class Event(_message.Message):
__slots__ = ("user_id", "external_user_id", "partner_id", "content")
USER_ID_FIELD_NUMBER: _ClassVar[int]
EXTERNAL_USER_ID_FIELD_NUMBER: _ClassVar[int]
PARTNER_ID_FIELD_NUMBER: _ClassVar[int]
CONTENT_FIELD_NUMBER: _ClassVar[int]
user_id: int
external_user_id: str
partner_id: int
content: EventContent
def __init__(self, user_id: _Optional[int] = ..., external_user_id: _Optional[str] = ..., partner_id: _Optional[int] = ..., content: _Optional[_Union[EventContent, _Mapping]] = ...) -> None: ...
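A short round-trip sketch with these generated classes (field values are made up for illustration; `content` is the oneof declared in event.proto):

from app.events.generated.event_pb2 import AliasCreated, Event, EventContent

event = Event(
    user_id=1,
    external_user_id="ext-user-1",
    partner_id=1,
    content=EventContent(
        alias_created=AliasCreated(
            alias_id=10,
            alias_email="hello@example.com",
            alias_note="",
            enabled=True,
        )
    ),
)
payload = event.SerializeToString()  # these bytes end up in SyncEvent.content

decoded = Event()
decoded.ParseFromString(payload)
assert decoded.content.WhichOneof("content") == "alias_created"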

View File

@ -30,14 +30,16 @@ def apply_dmarc_policy_for_forward_phase(
) -> Tuple[Message, Optional[str]]:
spam_result = SpamdResult.extract_from_headers(msg, Phase.forward)
if not DMARC_CHECK_ENABLED or not spam_result:
LOG.i("DMARC check disabled")
return msg, None
LOG.i(f"Spam check result in {spam_result}")
from_header = get_header_unicode(msg[headers.FROM])
warning_plain_text = f"""This email failed anti-phishing checks when it was received by SimpleLogin, be careful with its content.
warning_plain_text = """This email failed anti-phishing checks when it was received by SimpleLogin, be careful with its content.
More info on https://simplelogin.io/docs/getting-started/anti-phishing/
"""
warning_html = f"""
warning_html = """
<p style="color:red">
This email failed anti-phishing checks when it was received by SimpleLogin, be careful with its content.
More info on <a href="https://simplelogin.io/docs/getting-started/anti-phishing/">anti-phishing measure</a>
@ -131,7 +133,7 @@ def quarantine_dmarc_failed_forward_email(alias, contact, envelope, msg) -> Emai
refused_email = RefusedEmail.create(
full_report_path=s3_report_path, user_id=alias.user_id, flush=True
)
return EmailLog.create(
email_log = EmailLog.create(
user_id=alias.user_id,
mailbox_id=alias.mailbox_id,
contact_id=contact.id,
@ -142,6 +144,7 @@ def quarantine_dmarc_failed_forward_email(alias, contact, envelope, msg) -> Emai
blocked=True,
commit=True,
)
return email_log
def apply_dmarc_policy_for_reply_phase(
@ -149,8 +152,10 @@ def apply_dmarc_policy_for_reply_phase(
) -> Optional[str]:
spam_result = SpamdResult.extract_from_headers(msg, Phase.reply)
if not DMARC_CHECK_ENABLED or not spam_result:
LOG.i("DMARC check disabled")
return None
LOG.i(f"Spam check result is {spam_result}")
if spam_result.dmarc not in (
DmarcCheckResult.quarantine,
DmarcCheckResult.reject,

View File

@ -221,7 +221,7 @@ def handle_complaint(message: Message, origin: ProviderComplaintOrigin) -> bool:
return True
if is_deleted_alias(msg_info.sender_address):
LOG.i(f"Complaint is for deleted alias. Do nothing")
LOG.i("Complaint is for deleted alias. Do nothing")
return True
contact = Contact.get_by(reply_email=msg_info.sender_address)
@ -231,7 +231,7 @@ def handle_complaint(message: Message, origin: ProviderComplaintOrigin) -> bool:
alias = find_alias_with_address(msg_info.rcpt_address)
if is_deleted_alias(msg_info.rcpt_address):
LOG.i(f"Complaint is for deleted alias. Do nothing")
LOG.i("Complaint is for deleted alias. Do nothing")
return True
if not alias:

View File

@ -54,9 +54,8 @@ class UnsubscribeEncoder:
def encode_subject(
cls, action: UnsubscribeAction, data: Union[int, UnsubscribeOriginalData]
) -> str:
if (
action != UnsubscribeAction.OriginalUnsubscribeMailto
and type(data) is not int
if action != UnsubscribeAction.OriginalUnsubscribeMailto and not isinstance(
data, int
):
raise ValueError(f"Data has to be an int for an action of type {action}")
if action == UnsubscribeAction.OriginalUnsubscribeMailto:

View File

@ -3,6 +3,7 @@ from email.header import Header
from email.message import Message
from app.email import headers
from app import config
from app.email_utils import add_or_replace_header, delete_header
from app.handler.unsubscribe_encoder import (
UnsubscribeEncoder,
@ -47,6 +48,11 @@ class UnsubscribeGenerator:
method = raw_method[start + 1 : end]
url_data = urllib.parse.urlparse(method)
if url_data.scheme == "mailto":
if url_data.path == config.UNSUBSCRIBER:
LOG.debug(
f"Skipping replacing unsubscribe since the original email already points to {config.UNSUBSCRIBER}"
)
return message
query_data = urllib.parse.parse_qs(url_data.query)
mailto_unsubs = (url_data.path, query_data.get("subject", [""])[0])
LOG.debug(f"Unsub is mailto to {mailto_unsubs}")

View File

@ -5,6 +5,7 @@ from typing import Optional
from aiosmtpd.smtp import Envelope
from app import config
from app import alias_utils
from app.db import Session
from app.email import headers, status
from app.email_utils import (
@ -101,7 +102,7 @@ class UnsubscribeHandler:
mailbox.email, alias
):
return status.E509
alias.enabled = False
alias_utils.change_alias_status(alias, enabled=False)
Session.commit()
enable_alias_url = config.URL + f"/dashboard/?highlight_alias_id={alias.id}"
for mailbox in alias.mailboxes:

View File

@ -30,7 +30,10 @@ def handle_batch_import(batch_import: BatchImport):
LOG.d("Download file %s from %s", batch_import.file, file_url)
r = requests.get(file_url)
lines = [line.decode() for line in r.iter_lines()]
# Strip the BOM (an invisible character) and surrounding whitespace
lines = [
line.decode("utf-8").replace("\ufeff", "").strip() for line in r.iter_lines()
]
import_from_csv(batch_import, user, lines)

View File

@ -1,2 +1,4 @@
from .integrations import set_enable_proton_cookie
from .exit_sudo import exit_sudo_mode
__all__ = ["set_enable_proton_cookie", "exit_sudo_mode"]

View File

@ -39,7 +39,6 @@ from app.models import (
class ExportUserDataJob:
REMOVE_FIELDS = {
"User": ("otp_secret", "password"),
"Alias": ("ts_vector", "transfer_token", "hibp_last_check"),

View File

@ -22,7 +22,6 @@ from app.message_utils import message_to_bytes, message_format_base64_parts
@dataclass
class SendRequest:
SAVE_EXTENSION = "sendrequest"
envelope_from: str

View File

@ -27,7 +27,7 @@ from sqlalchemy.orm import deferred
from sqlalchemy.sql import and_
from sqlalchemy_utils import ArrowType
from app import config
from app import config, rate_limiter
from app import s3
from app.db import Session
from app.dns_utils import get_mx_domains
@ -235,6 +235,7 @@ class AuditLogActionEnum(EnumE):
download_provider_complaint = 8
disable_user = 9
enable_user = 10
stop_trial = 11
class Phase(EnumE):
@ -524,6 +525,11 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
sa.Boolean, default=True, nullable=False, server_default="1"
)
# user opted in for data breach check
enable_data_breach_check = sa.Column(
sa.Boolean, default=False, nullable=False, server_default="0"
)
# bitwise flags. Allow for future expansion
flags = sa.Column(
sa.BigInteger,
@ -651,6 +657,21 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return user
@classmethod
def delete(cls, obj_id, commit=False):
# Internal import to avoid global import cycles
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import UserDeleted, EventContent
user: User = cls.get(obj_id)
EventDispatcher.send_event(user, EventContent(user_deleted=UserDeleted()))
res = super(User, cls).delete(obj_id)
if commit:
Session.commit()
return res
def get_active_subscription(
self, include_partner_subscription: bool = True
) -> Optional[
@ -726,6 +747,11 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return True
def is_active(self) -> bool:
if self.delete_on is None:
return True
return self.delete_on > arrow.now()
def in_trial(self):
"""return True if user does not have lifetime licence or an active subscription AND is in trial period"""
if self.lifetime_or_active_subscription():
@ -827,6 +853,9 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
Whether user can create a new alias. User can't create a new alias if
- has more than 15 aliases in the free plan, *even in the free trial*
"""
if not self.is_active():
return False
if self.disabled:
return False
@ -907,7 +936,11 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return sub
def verified_custom_domains(self) -> List["CustomDomain"]:
return CustomDomain.filter_by(user_id=self.id, ownership_verified=True).all()
return (
CustomDomain.filter_by(user_id=self.id, ownership_verified=True)
.order_by(CustomDomain.domain.asc())
.all()
)
def mailboxes(self) -> List["Mailbox"]:
"""list of mailbox that user own"""
@ -1113,6 +1146,13 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return random_words(1)
def can_create_contacts(self) -> bool:
if self.is_premium():
return True
if self.flags & User.FLAG_FREE_DISABLE_CREATE_ALIAS == 0:
return True
return not config.DISABLE_CREATE_CONTACTS_FOR_FREE_USERS
def __repr__(self):
return f"<User {self.id} {self.name} {self.email}>"
@ -1402,6 +1442,9 @@ def generate_random_alias_email(
class Alias(Base, ModelMixin):
__tablename__ = "alias"
FLAG_PARTNER_CREATED = 1 << 0
user_id = sa.Column(
sa.ForeignKey(User.id, ondelete="cascade"), nullable=False, index=True
)
@ -1411,6 +1454,9 @@ class Alias(Base, ModelMixin):
name = sa.Column(sa.String(128), nullable=True, default=None)
enabled = sa.Column(sa.Boolean(), default=True, nullable=False)
flags = sa.Column(
sa.BigInteger(), default=0, server_default="0", nullable=False, index=True
)
custom_domain_id = sa.Column(
sa.ForeignKey("custom_domain.id", ondelete="cascade"), nullable=True, index=True
@ -1488,6 +1534,8 @@ class Alias(Base, ModelMixin):
TSVector(), sa.Computed("to_tsvector('english', note)", persisted=True)
)
last_email_log_id = sa.Column(sa.Integer, default=None, nullable=True)
__table_args__ = (
Index("ix_video___ts_vector__", ts_vector, postgresql_using="gin"),
# index on note column using pg_trgm
@ -1506,7 +1554,8 @@ class Alias(Base, ModelMixin):
def mailboxes(self):
ret = [self.mailbox]
for m in self._mailboxes:
ret.append(m)
if m.id != self.mailbox.id:
ret.append(m)
ret = [mb for mb in ret if mb.verified]
ret = sorted(ret, key=lambda mb: mb.email)
@ -1555,6 +1604,15 @@ class Alias(Base, ModelMixin):
flush = kw.pop("flush", False)
new_alias = cls(**kw)
user = User.get(new_alias.user_id)
if user.is_premium():
limits = config.ALIAS_CREATE_RATE_LIMIT_PAID
else:
limits = config.ALIAS_CREATE_RATE_LIMIT_FREE
# limits is array of (hits,days)
for limit in limits:
key = f"alias_create_{limit[1]}d:{user.id}"
rate_limiter.check_bucket_limit(key, limit[0], limit[1])
email = kw["email"]
# make sure email is lowercase and doesn't have any whitespace
@ -1576,6 +1634,18 @@ class Alias(Base, ModelMixin):
Session.add(new_alias)
DailyMetric.get_or_create_today_metric().nb_alias += 1
# Internal import to avoid global import cycles
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import AliasCreated, EventContent
event = AliasCreated(
alias_id=new_alias.id,
alias_email=new_alias.email,
alias_note=new_alias.note,
enabled=True,
)
EventDispatcher.send_event(user, EventContent(alias_created=event))
if commit:
Session.commit()
@ -2037,6 +2107,20 @@ class EmailLog(Base, ModelMixin):
def get_dashboard_url(self):
return f"{config.URL}/dashboard/refused_email?highlight_id={self.id}"
@classmethod
def create(cls, *args, **kwargs):
commit = kwargs.pop("commit", False)
email_log = super().create(*args, **kwargs)
Session.flush()
if "alias_id" in kwargs:
sql = "UPDATE alias SET last_email_log_id = :el_id WHERE id = :alias_id"
Session.execute(
sql, {"el_id": email_log.id, "alias_id": kwargs["alias_id"]}
)
if commit:
Session.commit()
return email_log
def __repr__(self):
return f"<EmailLog {self.id}>"
@ -2540,10 +2624,13 @@ class Job(Base, ModelMixin):
nullable=False,
server_default=str(JobState.ready.value),
default=JobState.ready.value,
index=True,
)
attempts = sa.Column(sa.Integer, nullable=False, server_default="0", default=0)
taken_at = sa.Column(ArrowType, nullable=True)
__table_args__ = (Index("ix_state_run_at_taken_at", state, run_at, taken_at),)
def __repr__(self):
return f"<Job {self.id} {self.name} {self.payload}>"
@ -2589,10 +2676,15 @@ class Mailbox(Base, ModelMixin):
return False
def nb_alias(self):
return (
AliasMailbox.filter_by(mailbox_id=self.id).count()
+ Alias.filter_by(mailbox_id=self.id).count()
alias_ids = set(
am.alias_id
for am in AliasMailbox.filter_by(mailbox_id=self.id).values(
AliasMailbox.alias_id
)
)
for alias in Alias.filter_by(mailbox_id=self.id).values(Alias.id):
alias_ids.add(alias.id)
return len(alias_ids)
def is_proton(self) -> bool:
if (
@ -2641,12 +2733,15 @@ class Mailbox(Base, ModelMixin):
@property
def aliases(self) -> [Alias]:
ret = Alias.filter_by(mailbox_id=self.id).all()
ret = dict(
(alias.id, alias) for alias in Alias.filter_by(mailbox_id=self.id).all()
)
for am in AliasMailbox.filter_by(mailbox_id=self.id):
ret.append(am.alias)
if am.alias_id not in ret:
ret[am.alias_id] = am.alias
return ret
return list(ret.values())
@classmethod
def create(cls, **kw):
@ -2891,7 +2986,9 @@ class RecoveryCode(Base, ModelMixin):
class Notification(Base, ModelMixin):
__tablename__ = "notification"
user_id = sa.Column(sa.ForeignKey(User.id, ondelete="cascade"), nullable=False)
user_id = sa.Column(
sa.ForeignKey(User.id, ondelete="cascade"), nullable=False, index=True
)
message = sa.Column(sa.Text, nullable=False)
title = sa.Column(sa.String(512))
@ -3132,6 +3229,20 @@ class TransactionalEmail(Base, ModelMixin):
__table_args__ = (sa.Index("ix_transactional_email_created_at", "created_at"),)
@classmethod
def create(cls, **kw):
# whether to call Session.commit
commit = kw.pop("commit", False)
r = cls(**kw)
if not config.STORE_TRANSACTIONAL_EMAILS:
return r
Session.add(r)
if commit:
Session.commit()
return r
class Payout(Base, ModelMixin):
"""Referral payouts"""
@ -3322,6 +3433,15 @@ class AdminAuditLog(Base):
},
)
@classmethod
def stop_trial(cls, admin_user_id: int, user_id: int):
cls.create(
admin_user_id=admin_user_id,
action=AuditLogActionEnum.stop_trial.value,
model="User",
model_id=user_id,
)
@classmethod
def disable_otp_fido(
cls, admin_user_id: int, user_id: int, had_otp: bool, had_fido: bool
@ -3517,7 +3637,7 @@ class PartnerSubscription(Base, ModelMixin):
class Newsletter(Base, ModelMixin):
__tablename__ = "newsletter"
subject = sa.Column(sa.String(), nullable=False, unique=True, index=True)
subject = sa.Column(sa.String(), nullable=False, index=True)
html = sa.Column(sa.Text)
plain_text = sa.Column(sa.Text)
@ -3555,3 +3675,52 @@ class ApiToCookieToken(Base, ModelMixin):
code = secrets.token_urlsafe(32)
return super().create(code=code, **kwargs)
class SyncEvent(Base, ModelMixin):
"""This model holds the events that need to be sent to the webhook"""
__tablename__ = "sync_event"
content = sa.Column(sa.LargeBinary, unique=False, nullable=False)
taken_time = sa.Column(
ArrowType, default=None, nullable=True, server_default=None, index=True
)
__table_args__ = (
sa.Index("ix_sync_event_created_at", "created_at"),
sa.Index("ix_sync_event_taken_time", "taken_time"),
)
def mark_as_taken(self) -> bool:
sql = """
UPDATE sync_event
SET taken_time = :taken_time
WHERE id = :sync_event_id
AND taken_time IS NULL
"""
args = {"taken_time": arrow.now().datetime, "sync_event_id": self.id}
res = Session.execute(sql, args)
Session.commit()
return res.rowcount > 0
@classmethod
def get_dead_letter(cls, older_than: Arrow) -> [SyncEvent]:
return (
SyncEvent.filter(
(
(
SyncEvent.taken_time.isnot(None)
& (SyncEvent.taken_time < older_than)
)
| (
SyncEvent.taken_time.is_(None)
& (SyncEvent.created_at < older_than)
)
)
)
.order_by(SyncEvent.id)
.limit(100)
.all()
)
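A worker built on this model could claim and decode events along these lines (a sketch assuming idempotent handlers; the real listener is added elsewhere in this changeset). mark_as_taken is the atomic claim, and get_dead_letter surfaces events that were created or claimed too long ago so a cleanup pass can retry them:

import arrow
from app.events.generated.event_pb2 import Event
from app.models import SyncEvent

def claim_and_handle(sync_event: SyncEvent, handler) -> bool:
    # Atomic claim: only one worker's UPDATE ... WHERE taken_time IS NULL
    # can succeed, so concurrent workers never process the same event.
    if not sync_event.mark_as_taken():
        return False
    event = Event()
    event.ParseFromString(sync_event.content)
    handler(event)
    return True

# Hypothetical dead-letter sweep: anything untouched for 10+ minutes
for stale in SyncEvent.get_dead_letter(older_than=arrow.now().shift(minutes=-10)):
    ...  # retry or alert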

View File

@ -1 +1,3 @@
from . import views
__all__ = ["views"]

View File

@ -1 +1,3 @@
from .views import authorize, token, user_info
__all__ = ["authorize", "token", "user_info"]

View File

@ -140,7 +140,7 @@ def authorize():
Scope=Scope,
)
else: # POST - user allows or denies
if not current_user.is_authenticated or not current_user.is_active:
if not current_user.is_authenticated or not current_user.is_active():
LOG.i(
"Attempt to validate a OAUth allow request by an unauthenticated user"
)

View File

@ -64,7 +64,7 @@ def _split_arg(arg_input: Union[str, list]) -> Set[str]:
- the response_type/scope passed as a list ?scope=scope_1&scope=scope_2
"""
res = set()
if type(arg_input) is str:
if isinstance(arg_input, str):
if " " in arg_input:
for x in arg_input.split(" "):
if x:

View File

@ -5,3 +5,11 @@ from .views import (
account_activated,
extension_redirect,
)
__all__ = [
"index",
"final",
"setup_done",
"account_activated",
"extension_redirect",
]

View File

@ -39,7 +39,6 @@ class _InnerLock:
lock_redis.storage.delete(lock_name)
def __call__(self, f: Callable[..., Any]):
if self.lock_suffix is None:
lock_suffix = f.__name__
else:

View File

@ -5,3 +5,11 @@ from .views import (
provider1_callback,
provider2_callback,
)
__all__ = [
"index",
"phone_reservation",
"twilio_callback",
"provider1_callback",
"provider2_callback",
]

42
app/rate_limiter.py Normal file
View File

@ -0,0 +1,42 @@
from datetime import datetime
from typing import Optional
import newrelic.agent
import redis.exceptions
import werkzeug.exceptions
from limits.storage import RedisStorage
from app.log import LOG
lock_redis: Optional[RedisStorage] = None
def set_redis_concurrent_lock(redis: RedisStorage):
global lock_redis
lock_redis = redis
def check_bucket_limit(
lock_name: Optional[str] = None,
max_hits: int = 5,
bucket_seconds: int = 3600,
):
# Calculate current bucket time
int_time = int(datetime.utcnow().timestamp())
bucket_id = int_time - (int_time % bucket_seconds)
bucket_lock_name = f"bl:{lock_name}:{bucket_id}"
if not lock_redis:
return
try:
value = lock_redis.incr(bucket_lock_name, bucket_seconds)
if value > max_hits:
LOG.i(
f"Rate limit hit for {lock_name} (bucket id {bucket_id}) -> {value}/{max_hits}"
)
newrelic.agent.record_custom_event(
"BucketRateLimit",
{"lock_name": lock_name, "bucket_seconds": bucket_seconds},
)
raise werkzeug.exceptions.TooManyRequests()
except (redis.exceptions.RedisError, AttributeError):
LOG.e("Cannot connect to redis")

View File

@ -2,21 +2,23 @@ import flask
import limits.storage
from app.parallel_limiter import set_redis_concurrent_lock
from app.rate_limiter import set_redis_concurrent_lock as rate_limit_set_redis
from app.session import RedisSessionStore
def initialize_redis_services(app: flask.Flask, redis_url: str):
if redis_url.startswith("redis://") or redis_url.startswith("rediss://"):
storage = limits.storage.RedisStorage(redis_url)
app.session_interface = RedisSessionStore(storage.storage, storage.storage, app)
set_redis_concurrent_lock(storage)
rate_limit_set_redis(storage)
elif redis_url.startswith("redis+sentinel://"):
storage = limits.storage.RedisSentinelStorage(redis_url)
app.session_interface = RedisSessionStore(
storage.storage, storage.storage_slave, app
)
set_redis_concurrent_lock(storage)
rate_limit_set_redis(storage)
else:
raise RuntimeError(
f"Tried to set_redis_session with an invalid redis url: ${redis_url}"

View File

@ -5,36 +5,39 @@ from typing import Optional
import boto3
import requests
from app.config import (
AWS_REGION,
BUCKET,
AWS_ACCESS_KEY_ID,
AWS_SECRET_ACCESS_KEY,
LOCAL_FILE_UPLOAD,
UPLOAD_DIR,
URL,
)
from app import config
from app.log import LOG
if not LOCAL_FILE_UPLOAD:
_session = boto3.Session(
aws_access_key_id=AWS_ACCESS_KEY_ID,
aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
region_name=AWS_REGION,
)
_s3_client = None
def upload_from_bytesio(key: str, bs: BytesIO, content_type="string"):
def _get_s3client():
global _s3_client
if _s3_client is None:
args = {
"aws_access_key_id": config.AWS_ACCESS_KEY_ID,
"aws_secret_access_key": config.AWS_SECRET_ACCESS_KEY,
"region_name": config.AWS_REGION,
}
if config.AWS_ENDPOINT_URL:
args["endpoint_url"] = config.AWS_ENDPOINT_URL
_s3_client = boto3.client("s3", **args)
return _s3_client
def upload_from_bytesio(key: str, bs: BytesIO, content_type="application/octet-stream"):
bs.seek(0)
if LOCAL_FILE_UPLOAD:
file_path = os.path.join(UPLOAD_DIR, key)
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, key)
file_dir = os.path.dirname(file_path)
os.makedirs(file_dir, exist_ok=True)
with open(file_path, "wb") as f:
f.write(bs.read())
else:
_session.resource("s3").Bucket(BUCKET).put_object(
_get_s3client().put_object(
Bucket=config.BUCKET,
Key=key,
Body=bs,
ContentType=content_type,
@ -44,15 +47,16 @@ def upload_from_bytesio(key: str, bs: BytesIO, content_type="string"):
def upload_email_from_bytesio(path: str, bs: BytesIO, filename):
bs.seek(0)
if LOCAL_FILE_UPLOAD:
file_path = os.path.join(UPLOAD_DIR, path)
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, path)
file_dir = os.path.dirname(file_path)
os.makedirs(file_dir, exist_ok=True)
with open(file_path, "wb") as f:
f.write(bs.read())
else:
_session.resource("s3").Bucket(BUCKET).put_object(
_get_s3client().put_object(
Bucket=config.BUCKET,
Key=path,
Body=bs,
# Support saving a remote file using Http header
@ -63,16 +67,13 @@ def upload_email_from_bytesio(path: str, bs: BytesIO, filename):
def download_email(path: str) -> Optional[str]:
if LOCAL_FILE_UPLOAD:
file_path = os.path.join(UPLOAD_DIR, path)
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, path)
with open(file_path, "rb") as f:
return f.read()
resp = (
_session.resource("s3")
.Bucket(BUCKET)
.get_object(
Key=path,
)
resp = _get_s3client().get_object(
Bucket=config.BUCKET,
Key=path,
)
if not resp or "Body" not in resp:
return None
@ -85,20 +86,30 @@ def upload_from_url(url: str, upload_path):
def get_url(key: str, expires_in=3600) -> str:
if LOCAL_FILE_UPLOAD:
return URL + "/static/upload/" + key
if config.LOCAL_FILE_UPLOAD:
return config.URL + "/static/upload/" + key
else:
s3_client = _session.client("s3")
return s3_client.generate_presigned_url(
return _get_s3client().generate_presigned_url(
ExpiresIn=expires_in,
ClientMethod="get_object",
Params={"Bucket": BUCKET, "Key": key},
Params={"Bucket": config.BUCKET, "Key": key},
)
def delete(path: str):
if LOCAL_FILE_UPLOAD:
os.remove(os.path.join(UPLOAD_DIR, path))
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, path)
os.remove(file_path)
else:
o = _session.resource("s3").Bucket(BUCKET).Object(path)
o.delete()
_get_s3client().delete_object(Bucket=config.BUCKET, Key=path)
def create_bucket_if_not_exists():
s3client = _get_s3client()
buckets = s3client.list_buckets()
for bucket in buckets["Buckets"]:
if bucket["Name"] == config.BUCKET:
LOG.i("Bucket already exists")
return
s3client.create_bucket(Bucket=config.BUCKET)
LOG.i(f"Bucket {config.BUCKET} created")

View File

@ -75,7 +75,7 @@ class RedisSessionStore(SessionInterface):
try:
data = pickle.loads(val)
return ServerSession(data, session_id=session_id)
except:
except Exception:
pass
return ServerSession(session_id=str(uuid.uuid4()))

View File

@ -2,6 +2,8 @@ import requests
from requests import RequestException
from app import config
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import EventContent, UserPlanChange
from app.log import LOG
from app.models import User
@ -31,3 +33,6 @@ def execute_subscription_webhook(user: User):
)
except RequestException as e:
LOG.error(f"Subscription request exception: {e}")
event = UserPlanChange(plan_end_time=sl_subscription_end)
EventDispatcher.send_event(user, EventContent(user_plan_change=event))

View File

@ -49,11 +49,11 @@ def random_string(length=10, include_digits=False):
def convert_to_id(s: str):
"""convert a string to id-like: remove space, remove special accent"""
s = s.replace(" ", "")
s = s.lower()
s = unidecode(s)
s = s.replace(" ", "")
return s
return s[:256]
_ALLOWED_CHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_-."

192
cron.py
View File

@ -61,6 +61,11 @@ from app.pgp_utils import load_public_key_and_check, PGPException
from app.proton.utils import get_proton_partner
from app.utils import sanitize_email
from server import create_light_app
from tasks.cleanup_old_imports import cleanup_old_imports
from tasks.cleanup_old_jobs import cleanup_old_jobs
from tasks.cleanup_old_notifications import cleanup_old_notifications
DELETE_GRACE_DAYS = 30
def notify_trial_end():
@ -105,7 +110,7 @@ def delete_logs():
rows_to_delete = EmailLog.filter(EmailLog.created_at < cutoff_time).count()
expected_queries = int(rows_to_delete / batch_size)
sql = text(
f"DELETE FROM email_log WHERE id IN (SELECT id FROM email_log WHERE created_at < :cutoff_time order by created_at limit :batch_size)"
"DELETE FROM email_log WHERE id IN (SELECT id FROM email_log WHERE created_at < :cutoff_time order by created_at limit :batch_size)"
)
str_cutoff_time = cutoff_time.isoformat()
while total_deleted < rows_to_delete:
@ -161,7 +166,7 @@ def notify_premium_end():
send_email(
user.email,
f"Your subscription will end soon",
"Your subscription will end soon",
render(
"transactional/subscription-end.txt",
user=user,
@ -218,7 +223,7 @@ def notify_manual_sub_end():
LOG.d("Remind user %s that their manual sub is ending soon", user)
send_email(
user.email,
f"Your subscription will end soon",
"Your subscription will end soon",
render(
"transactional/manual-subscription-end.txt",
user=user,
@ -590,21 +595,21 @@ nb_total_bounced_last_24h: {stats_today.nb_total_bounced_last_24h} - {increase_p
"""
monitoring_report += "\n====================================\n"
monitoring_report += f"""
monitoring_report += """
# Account bounce report:
"""
for email, bounces in bounce_report():
monitoring_report += f"{email}: {bounces}\n"
monitoring_report += f"""\n
monitoring_report += """\n
# Alias creation report:
"""
for email, nb_alias, date in alias_creation_report():
monitoring_report += f"{email}, {date}: {nb_alias}\n"
monitoring_report += f"""\n
monitoring_report += """\n
# Full bounce detail report:
"""
monitoring_report += all_bounce_report()
@ -960,6 +965,9 @@ async def _hibp_check(api_key, queue):
This function is meant to be run concurrently (multiple _hibp_check coroutines with different keys on the same queue) to make maximum use of multiple API keys.
"""
default_rate_sleep = (60.0 / config.HIBP_RPM) + 0.1
rate_sleep = default_rate_sleep
rate_hit_counter = 0
while True:
try:
alias_id = queue.get_nowait()
@ -967,9 +975,14 @@ async def _hibp_check(api_key, queue):
return
alias = Alias.get(alias_id)
# an alias can be deleted in the meantime
if not alias:
return
continue
user = alias.user
if user.disabled or not user.is_paid():
# Mark it as hibp done to skip it as if it had been checked
alias.hibp_last_check = arrow.utcnow()
Session.commit()
continue
LOG.d("Checking HIBP for %s", alias)
@ -981,7 +994,6 @@ async def _hibp_check(api_key, queue):
f"https://haveibeenpwned.com/api/v3/breachedaccount/{urllib.parse.quote(alias.email)}",
headers=request_headers,
)
if r.status_code == 200:
# Breaches found
alias.hibp_breaches = [
@ -989,20 +1001,27 @@ async def _hibp_check(api_key, queue):
]
if len(alias.hibp_breaches) > 0:
LOG.w("%s appears in HIBP breaches %s", alias, alias.hibp_breaches)
if rate_hit_counter > 0:
rate_hit_counter -= 1
elif r.status_code == 404:
# No breaches found
alias.hibp_breaches = []
elif r.status_code == 429:
# rate limited
LOG.w("HIBP rate limited, check alias %s in the next run", alias)
await asyncio.sleep(1.6)
return
rate_hit_counter += 1
rate_sleep = default_rate_sleep + (0.2 * rate_hit_counter)
if rate_hit_counter > 10:
LOG.w(f"HIBP rate limited too many times stopping with alias {alias}")
return
# Just sleep for a while
await asyncio.sleep(5)
elif r.status_code > 500:
LOG.w("HIBP server 5** error %s", r.status_code)
return
else:
LOG.error(
"An error occured while checking alias %s: %s - %s",
"An error occurred while checking alias %s: %s - %s",
alias,
r.status_code,
r.text,
@ -1013,9 +1032,63 @@ async def _hibp_check(api_key, queue):
Session.add(alias)
Session.commit()
LOG.d("Updated breaches info for %s", alias)
LOG.d("Updated breach info for %s", alias)
await asyncio.sleep(rate_sleep)
await asyncio.sleep(1.6)
def get_alias_to_check_hibp(
oldest_hibp_allowed: arrow.Arrow,
user_ids_to_skip: list[int],
min_alias_id: int,
max_alias_id: int,
):
now = arrow.now()
alias_query = (
Session.query(Alias)
.join(User, User.id == Alias.user_id)
.join(Subscription, User.id == Subscription.user_id, isouter=True)
.join(ManualSubscription, User.id == ManualSubscription.user_id, isouter=True)
.join(AppleSubscription, User.id == AppleSubscription.user_id, isouter=True)
.join(
CoinbaseSubscription,
User.id == CoinbaseSubscription.user_id,
isouter=True,
)
.join(PartnerUser, User.id == PartnerUser.user_id, isouter=True)
.join(
PartnerSubscription,
PartnerSubscription.partner_user_id == PartnerUser.id,
isouter=True,
)
.filter(
or_(
Alias.hibp_last_check.is_(None),
Alias.hibp_last_check < oldest_hibp_allowed,
),
Alias.user_id.notin_(user_ids_to_skip),
Alias.enabled,
Alias.id >= min_alias_id,
Alias.id < max_alias_id,
User.disabled == False, # noqa: E712
User.enable_data_breach_check,
or_(
User.lifetime,
ManualSubscription.end_at > now,
Subscription.next_bill_date > now.date(),
AppleSubscription.expires_date > now,
CoinbaseSubscription.end_at > now,
PartnerSubscription.end_at > now,
),
)
)
if config.HIBP_SKIP_PARTNER_ALIAS:
alias_query = alias_query.filter(
Alias.flags.op("&")(Alias.FLAG_PARTNER_CREATED) == 0
)
for alias in (
alias_query.order_by(Alias.id.asc()).enable_eagerloads(False).yield_per(500)
):
yield alias
async def check_hibp():
@ -1038,40 +1111,49 @@ async def check_hibp():
Session.commit()
LOG.d("Updated list of known breaches")
LOG.d("Preparing list of aliases to check")
LOG.d("Getting the list of users to skip")
query = "select u.id, count(a.id) from users u, alias a where a.user_id=u.id group by u.id having count(a.id) > :max_alias"
rows = Session.execute(query, {"max_alias": config.HIBP_MAX_ALIAS_CHECK})
user_ids = [row[0] for row in rows]
LOG.d("Got %d users to skip" % len(user_ids))
LOG.d("Checking aliases")
queue = asyncio.Queue()
max_date = arrow.now().shift(days=-config.HIBP_SCAN_INTERVAL_DAYS)
for alias in (
Alias.filter(
or_(Alias.hibp_last_check.is_(None), Alias.hibp_last_check < max_date)
min_alias_id = 0
max_alias_id = Session.query(func.max(Alias.id)).scalar()
step = 10000
now = arrow.now()
oldest_hibp_allowed = now.shift(days=-config.HIBP_SCAN_INTERVAL_DAYS)
alias_checked = 0
for alias_batch_id in range(min_alias_id, max_alias_id, step):
for alias in get_alias_to_check_hibp(
oldest_hibp_allowed, user_ids, alias_batch_id, alias_batch_id + step
):
await queue.put(alias.id)
alias_checked += queue.qsize()
LOG.d(
f"Need to check about {queue.qsize()} aliases in this loop {alias_batch_id}/{max_alias_id}"
)
.filter(Alias.enabled)
.order_by(Alias.hibp_last_check.asc())
.yield_per(500)
.enable_eagerloads(False)
):
await queue.put(alias.id)
LOG.d("Need to check about %s aliases", queue.qsize())
# Start one checking process per API key
# Each checking process will take one alias from the queue, get the info
# and then sleep for 1.5 seconds (due to HIBP API request limits)
checkers = []
for i in range(len(config.HIBP_API_KEYS)):
checker = asyncio.create_task(
_hibp_check(
config.HIBP_API_KEYS[i],
queue,
# Start one checking process per API key
# Each checking process will take one alias from the queue, get the info
# and then sleep for 1.5 seconds (due to HIBP API request limits)
checkers = []
for i in range(len(config.HIBP_API_KEYS)):
checker = asyncio.create_task(
_hibp_check(
config.HIBP_API_KEYS[i],
queue,
)
)
)
checkers.append(checker)
checkers.append(checker)
# Wait until all checking processes are done
for checker in checkers:
await checker
# Wait until all checking processes are done
for checker in checkers:
await checker
LOG.d("Done checking HIBP API for aliases in breaches")
LOG.d(f"Done checking {alias_checked} HIBP API for aliases in breaches")
def notify_hibp():
@ -1099,14 +1181,14 @@ def notify_hibp():
)
LOG.d(
f"Send new breaches found email to %s for %s breaches aliases",
"Send new breaches found email to %s for %s breaches aliases",
user,
len(breached_aliases),
)
send_email(
user.email,
f"You were in a data breach",
"You were in a data breach",
render(
"transactional/hibp-new-breaches.txt.jinja2",
user=user,
@ -1126,18 +1208,30 @@ def notify_hibp():
Session.commit()
def clear_users_scheduled_to_be_deleted():
def clear_users_scheduled_to_be_deleted(dry_run=False):
users = User.filter(
and_(User.delete_on.isnot(None), User.delete_on < arrow.now())
and_(
User.delete_on.isnot(None),
User.delete_on <= arrow.now().shift(days=-DELETE_GRACE_DAYS),
)
).all()
for user in users:
LOG.i(
f"Scheduled deletion of user {user} with scheduled delete on {user.delete_on}"
)
if dry_run:
continue
User.delete(user.id)
Session.commit()
def delete_old_data():
oldest_valid = arrow.now().shift(days=-config.KEEP_OLD_DATA_DAYS)
cleanup_old_imports(oldest_valid)
cleanup_old_jobs(oldest_valid)
cleanup_old_notifications(oldest_valid)
if __name__ == "__main__":
LOG.d("Start running cronjob")
parser = argparse.ArgumentParser()
@ -1152,6 +1246,7 @@ if __name__ == "__main__":
"notify_manual_subscription_end",
"notify_premium_end",
"delete_logs",
"delete_old_data",
"poll_apple_subscription",
"sanity_check",
"delete_old_monitoring",
@ -1180,6 +1275,9 @@ if __name__ == "__main__":
elif args.job == "delete_logs":
LOG.d("Deleted Logs")
delete_logs()
elif args.job == "delete_old_data":
LOG.d("Delete old data")
delete_old_data()
elif args.job == "poll_apple_subscription":
LOG.d("Poll Apple Subscriptions")
poll_apple_subscription()
@ -1206,4 +1304,4 @@ if __name__ == "__main__":
load_unsent_mails_from_fs_and_resend()
elif args.job == "delete_scheduled_users":
LOG.d("Deleting users scheduled to be deleted")
clear_users_scheduled_to_be_deleted()
clear_users_scheduled_to_be_deleted(dry_run=True)


@ -37,6 +37,12 @@ jobs:
schedule: "15 5 * * *"
captureStderr: true
- name: SimpleLogin Delete Old data
command: python /code/cron.py -j delete_old_data
shell: /bin/bash
schedule: "30 5 * * *"
captureStderr: true
- name: SimpleLogin Poll Apple Subscriptions
command: python /code/cron.py -j poll_apple_subscription
shell: /bin/bash
@ -62,7 +68,7 @@ jobs:
captureStderr: true
- name: SimpleLogin delete users scheduled to be deleted
command: echo disabled_user_deletion #python /code/cron.py -j delete_scheduled_users
command: python /code/cron.py -j delete_scheduled_users
shell: /bin/bash
schedule: "15 11 * * *"
captureStderr: true


@ -388,7 +388,7 @@ Input:
- (Optional but recommended) `hostname` passed in query string
- Request Message Body in json (`Content-Type` is `application/json`)
- alias_prefix: string. The first part of the alias that the user can choose.
- signed_suffix: should be one of the suffixes returned in the `GET /api/v4/alias/options` endpoint.
- signed_suffix: should be one of the suffixes returned in the `GET /api/v5/alias/options` endpoint.
- mailbox_ids: list of mailbox_id values that "own" this alias
- (Optional) note: alias note
- (Optional) name: alias name
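A hedged request sketch for this endpoint (the endpoint path, the Authentication header, and all values are assumptions based on the surrounding docs, not confirmed by this excerpt):

    import requests

    res = requests.post(
        "https://app.simplelogin.io/api/v3/alias/custom/new",  # assumed path
        params={"hostname": "example.com"},           # optional but recommended
        headers={"Authentication": "your-api-key"},   # assumed auth header
        json={
            "alias_prefix": "news",
            "signed_suffix": ".x@example.com.Xq2kz",  # from GET /api/v5/alias/options
            "mailbox_ids": [1],
            "note": "alias note",   # optional
            "name": "alias name",   # optional
        },
    )
    print(res.status_code, res.json())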


@ -53,7 +53,7 @@ from flanker.addresslib.address import EmailAddress
from sqlalchemy.exc import IntegrityError
from app import pgp_utils, s3, config
from app.alias_utils import try_auto_create
from app.alias_utils import try_auto_create, change_alias_status
from app.config import (
EMAIL_DOMAIN,
URL,
@ -235,17 +235,17 @@ def get_or_create_contact(from_header: str, mail_from: str, alias: Alias) -> Con
contact.mail_from = mail_from
Session.commit()
else:
try:
contact_email_for_reply = (
contact_email if is_valid_email(contact_email) else ""
)
contact = Contact.create(
user_id=alias.user_id,
alias_id=alias.id,
website_email=contact_email,
name=contact_name,
mail_from=mail_from,
reply_email=generate_reply_email(contact_email, alias)
if is_valid_email(contact_email)
else NOREPLY,
reply_email=generate_reply_email(contact_email_for_reply, alias),
automatic_created=True,
)
if not contact_email:
@ -637,6 +637,10 @@ def handle_forward(envelope, msg: Message, rcpt_to: str) -> List[Tuple[bool, str
user = alias.user
if not user.is_active():
LOG.w(f"User {user} has been soft deleted")
return False, status.E502
if not user.can_send_or_receive():
LOG.i(f"User {user} cannot receive emails")
if should_ignore_bounce(envelope.mail_from):
@ -871,6 +875,7 @@ def forward_email_to_mailbox(
# References and In-Reply-To are used for keeping the email thread
headers.REFERENCES,
headers.IN_REPLY_TO,
headers.SL_QUEUE_ID,
headers.LIST_UNSUBSCRIBE,
headers.LIST_UNSUBSCRIBE_POST,
] + headers.MIME_HEADERS
@ -1056,6 +1061,9 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
if not contact:
LOG.w(f"No contact with {reply_email} as reverse alias")
return False, status.E502
if not contact.user.is_active():
LOG.w(f"User {contact.user} has been soft deleted")
return False, status.E502
alias = contact.alias
alias_address: str = contact.alias.email
@ -1172,6 +1180,7 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
# References and In-Reply-To are used for keeping the email thread
headers.REFERENCES,
headers.IN_REPLY_TO,
headers.SL_QUEUE_ID,
]
+ headers.MIME_HEADERS,
)
@ -1197,7 +1206,7 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
)
# replace reverse alias by real address for all contacts
for (reply_email, website_email) in contact_query.values(
for reply_email, website_email in contact_query.values(
Contact.reply_email, Contact.website_email
):
msg = replace(msg, reply_email, website_email)
@ -1576,7 +1585,7 @@ def handle_bounce_forward_phase(msg: Message, email_log: EmailLog):
LOG.w(
f"Disable alias {alias} because {reason}. {alias.mailboxes} {alias.user}. Last contact {contact}"
)
alias.enabled = False
change_alias_status(alias, enabled=False)
Notification.create(
user_id=user.id,
@ -1884,24 +1893,30 @@ def handle_transactional_bounce(
envelope: Envelope, msg, rcpt_to, transactional_id=None
):
LOG.d("handle transactional bounce sent to %s", rcpt_to)
if transactional_id is None:
LOG.i(
f"No transactional record for {envelope.mail_from} -> {envelope.rcpt_tos}"
)
return
# parse the TransactionalEmail
transactional_id = transactional_id or parse_id_from_bounce(rcpt_to)
transactional = TransactionalEmail.get(transactional_id)
# a transaction might have been deleted in delete_logs()
if transactional:
LOG.i("Create bounce for %s", transactional.email)
bounce_info = get_mailbox_bounce_info(msg)
if bounce_info:
Bounce.create(
email=transactional.email,
info=bounce_info.as_bytes().decode(),
commit=True,
)
else:
LOG.w("cannot get bounce info, debug at %s", save_email_for_debugging(msg))
Bounce.create(email=transactional.email, commit=True)
if not transactional:
LOG.i(
f"No transactional record for {envelope.mail_from} -> {envelope.rcpt_tos}"
)
return
LOG.i("Create bounce for %s", transactional.email)
bounce_info = get_mailbox_bounce_info(msg)
if bounce_info:
Bounce.create(
email=transactional.email,
info=bounce_info.as_bytes().decode(),
commit=True,
)
else:
LOG.w("cannot get bounce info, debug at %s", save_email_for_debugging(msg))
Bounce.create(email=transactional.email, commit=True)
def handle_bounce(envelope, email_log: EmailLog, msg: Message) -> str:
@ -1922,6 +1937,9 @@ def handle_bounce(envelope, email_log: EmailLog, msg: Message) -> str:
contact,
alias,
)
if not email_log.user.is_active():
LOG.d(f"User {email_log.user} is not active")
return status.E510
if email_log.is_reply:
content_type = msg.get_content_type().lower()
@ -1952,7 +1970,7 @@ def handle_bounce(envelope, email_log: EmailLog, msg: Message) -> str:
for is_delivered, smtp_status in handle_forward(envelope, msg, alias.email):
res.append((is_delivered, smtp_status))
for (is_success, smtp_status) in res:
for is_success, smtp_status in res:
# Consider all deliveries successful if 1 delivery is successful
if is_success:
return smtp_status
@ -1983,6 +2001,9 @@ def send_no_reply_response(mail_from: str, msg: Message):
if not mailbox:
LOG.d("Unknown sender. Skipping reply from {}".format(NOREPLY))
return
if not mailbox.user.is_active():
LOG.d(f"User {mailbox.user} is soft-deleted. Skipping sending reply response")
return
send_email_at_most_times(
mailbox.user,
ALERT_TO_NOREPLY,
@ -2021,10 +2042,11 @@ def handle(envelope: Envelope, msg: Message) -> str:
return status.E204
# sanitize email headers
sanitize_header(msg, "from")
sanitize_header(msg, "to")
sanitize_header(msg, "cc")
sanitize_header(msg, "reply-to")
sanitize_header(msg, headers.FROM)
sanitize_header(msg, headers.TO)
sanitize_header(msg, headers.CC)
sanitize_header(msg, headers.REPLY_TO)
sanitize_header(msg, headers.MESSAGE_ID)
LOG.d(
"==>> Handle mail_from:%s, rcpt_tos:%s, header_from:%s, header_to:%s, "
@ -2272,7 +2294,7 @@ def handle(envelope: Envelope, msg: Message) -> str:
if nb_success > 0 and nb_non_success > 0:
LOG.e(f"some deliveries fail and some success, {mail_from}, {rcpt_tos}, {res}")
for (is_success, smtp_status) in res:
for is_success, smtp_status in res:
# Consider all deliveries successful if 1 delivery is successful
if is_success:
return smtp_status

event_listener.py Normal file

@ -0,0 +1,64 @@
import argparse
from enum import Enum
from sys import argv, exit
from app.config import DB_URI
from app.log import LOG
from events.runner import Runner
from events.event_source import DeadLetterEventSource, PostgresEventSource
from events.event_sink import ConsoleEventSink, HttpEventSink
class Mode(Enum):
DEAD_LETTER = "dead_letter"
LISTENER = "listener"
@staticmethod
def from_str(value: str):
if value == Mode.DEAD_LETTER.value:
return Mode.DEAD_LETTER
elif value == Mode.LISTENER.value:
return Mode.LISTENER
else:
raise ValueError(f"Invalid mode: {value}")
def main(mode: Mode, dry_run: bool):
if mode == Mode.DEAD_LETTER:
LOG.i("Using DeadLetterEventSource")
source = DeadLetterEventSource()
elif mode == Mode.LISTENER:
LOG.i("Using PostgresEventSource")
source = PostgresEventSource(DB_URI)
else:
raise ValueError(f"Invalid mode: {mode}")
if dry_run:
LOG.i("Starting with ConsoleEventSink")
sink = ConsoleEventSink()
else:
LOG.i("Starting with HttpEventSink")
sink = HttpEventSink()
runner = Runner(source=source, sink=sink)
runner.run()
def args():
parser = argparse.ArgumentParser(description="Run event listener")
parser.add_argument(
"mode",
help="Mode to run",
choices=[Mode.DEAD_LETTER.value, Mode.LISTENER.value],
)
parser.add_argument("--dry-run", help="Dry run mode", action="store_true")
return parser.parse_args()
if __name__ == "__main__":
if len(argv) < 2:
print("Invalid usage. Pass 'listener' or 'dead_letter' as argument")
exit(1)
args = args()
main(Mode.from_str(args.mode), args.dry_run)
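Under these flags the listener can be started as, e.g., `python event_listener.py listener` for the LISTEN/NOTIFY mode, or `python event_listener.py dead_letter --dry-run` to re-process stale events against the console sink instead of the webhook.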

events/__init__.py Normal file

events/event_sink.py Normal file

@ -0,0 +1,42 @@
import requests
from abc import ABC, abstractmethod
from app.config import EVENT_WEBHOOK, EVENT_WEBHOOK_SKIP_VERIFY_SSL
from app.log import LOG
from app.models import SyncEvent
class EventSink(ABC):
@abstractmethod
def process(self, event: SyncEvent) -> bool:
pass
class HttpEventSink(EventSink):
def process(self, event: SyncEvent) -> bool:
if not EVENT_WEBHOOK:
LOG.warning("Skipping sending event because there is no webhook configured")
return False
LOG.info(f"Sending event {event.id} to {EVENT_WEBHOOK}")
res = requests.post(
url=EVENT_WEBHOOK,
data=event.content,
headers={"Content-Type": "application/x-protobuf"},
verify=not EVENT_WEBHOOK_SKIP_VERIFY_SSL,
)
if res.status_code != 200:
LOG.warning(
f"Failed to send event to webhook: {res.status_code} {res.text}"
)
return False
else:
LOG.info(f"Event {event.id} sent successfully to webhook")
return True
class ConsoleEventSink(EventSink):
def process(self, event: SyncEvent) -> bool:
LOG.info(f"Handling event {event.id}")
return True
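For testing HttpEventSink end to end, any endpoint that accepts raw protobuf bytes and answers 200 will do; a minimal stdlib sketch (names assumed; a non-200 answer makes process() return False, leaving the event for the dead-letter runner):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class WebhookHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
            # body is a serialized Event; parse it with protoc-generated bindings,
            # e.g. event_pb2.Event().ParseFromString(body) (module name assumed)
            self.send_response(200)
            self.end_headers()

    HTTPServer(("127.0.0.1", 8080), WebhookHandler).serve_forever()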

events/event_source.py Normal file

@ -0,0 +1,100 @@
import arrow
import newrelic.agent
import psycopg2
import select
from abc import ABC, abstractmethod
from app.log import LOG
from app.models import SyncEvent
from app.events.event_dispatcher import NOTIFICATION_CHANNEL
from time import sleep
from typing import Callable, NoReturn
_DEAD_LETTER_THRESHOLD_MINUTES = 10
_DEAD_LETTER_INTERVAL_SECONDS = 30
_POSTGRES_RECONNECT_INTERVAL_SECONDS = 5
class EventSource(ABC):
@abstractmethod
def run(self, on_event: Callable[[SyncEvent], NoReturn]):
pass
class PostgresEventSource(EventSource):
def __init__(self, connection_string: str):
self.__connection_string = connection_string
self.__connect()
def run(self, on_event: Callable[[SyncEvent], NoReturn]):
while True:
try:
self.__listen(on_event)
except Exception as e:
LOG.warn(f"Error listening to events: {e}")
sleep(_POSTGRES_RECONNECT_INTERVAL_SECONDS)
self.__connect()
def __listen(self, on_event: Callable[[SyncEvent], NoReturn]):
self.__connection.set_isolation_level(
psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT
)
cursor = self.__connection.cursor()
cursor.execute(f"LISTEN {NOTIFICATION_CHANNEL};")
while True:
if select.select([self.__connection], [], [], 5) != ([], [], []):
self.__connection.poll()
while self.__connection.notifies:
notify = self.__connection.notifies.pop(0)
LOG.debug(
f"Got NOTIFY: pid={notify.pid} channel={notify.channel} payload={notify.payload}"
)
try:
webhook_id = int(notify.payload)
event = SyncEvent.get_by(id=webhook_id)
if event is not None:
if event.mark_as_taken():
on_event(event)
else:
LOG.info(
f"Event {event.id} was handled by another runner"
)
else:
LOG.info(f"Could not find event with id={notify.payload}")
except Exception as e:
LOG.warn(f"Error getting event: {e}")
def __connect(self):
self.__connection = psycopg2.connect(self.__connection_string)
from app.db import Session
Session.close()
class DeadLetterEventSource(EventSource):
@newrelic.agent.background_task()
def run(self, on_event: Callable[[SyncEvent], NoReturn]):
while True:
try:
threshold = arrow.utcnow().shift(
minutes=-_DEAD_LETTER_THRESHOLD_MINUTES
)
events = SyncEvent.get_dead_letter(older_than=threshold)
if events:
LOG.info(f"Got {len(events)} dead letter events")
newrelic.agent.record_custom_metric(
"Custom/dead_letter_events_to_process", len(events)
)
for event in events:
on_event(event)
else:
LOG.debug("No dead letter events")
sleep(_DEAD_LETTER_INTERVAL_SECONDS)
except Exception as e:
LOG.warn(f"Error getting dead letter event: {e}")
sleep(_DEAD_LETTER_INTERVAL_SECONDS)
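PostgresEventSource only covers the LISTEN side; the dispatcher is expected to NOTIFY the same channel with the sync_event row id as payload, which is what `int(notify.payload)` above parses. A hedged sketch of that producing side (the channel name and DSN are assumptions; the real channel lives in app.events.event_dispatcher):

    import psycopg2

    NOTIFICATION_CHANNEL = "simplelogin_sync_events"  # assumed channel name

    conn = psycopg2.connect("postgresql://user:pass@localhost/simplelogin")
    conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
    with conn.cursor() as cur:
        # payload is the sync_event row id, matching int(notify.payload) above
        cur.execute("SELECT pg_notify(%s, %s)", (NOTIFICATION_CHANNEL, "42"))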

events/runner.py Normal file

@ -0,0 +1,42 @@
import arrow
import newrelic.agent
from app.log import LOG
from app.models import SyncEvent
from events.event_sink import EventSink
from events.event_source import EventSource
class Runner:
def __init__(self, source: EventSource, sink: EventSink):
self.__source = source
self.__sink = sink
def run(self):
self.__source.run(self.__on_event)
@newrelic.agent.background_task()
def __on_event(self, event: SyncEvent):
try:
event_created_at = event.created_at
start_time = arrow.now()
success = self.__sink.process(event)
if success:
event_id = event.id
SyncEvent.delete(event.id, commit=True)
LOG.info(f"Marked {event_id} as done")
end_time = arrow.now() - start_time
time_between_taken_and_created = start_time - event_created_at
newrelic.agent.record_custom_metric("Custom/sync_event_processed", 1)
newrelic.agent.record_custom_metric(
"Custom/sync_event_process_time", end_time.total_seconds()
)
newrelic.agent.record_custom_metric(
"Custom/sync_event_elapsed_time",
time_between_taken_and_created.total_seconds(),
)
except Exception as e:
LOG.warn(f"Exception processing event [id={event.id}]: {e}")
newrelic.agent.record_custom_metric("Custom/sync_event_failed", 1)


@ -116,6 +116,14 @@ WORDS_FILE_PATH=local_data/test_words.txt
# CONNECT_WITH_PROTON=true
# CONNECT_WITH_PROTON_COOKIE_NAME=to_fill
# Login with OIDC
# CONNECT_WITH_OIDC_ICON=fa-github
# OIDC_WELL_KNOWN_URL=to_fill
# OIDC_SCOPES=openid email profile
# OIDC_NAME_FIELD=name
# OIDC_CLIENT_ID=to_fill
# OIDC_CLIENT_SECRET=to_fill
# Flask profiler
# FLASK_PROFILER_PATH=/tmp/flask-profiler.sql
# FLASK_PROFILER_PASSWORD=password


@ -192,7 +192,6 @@ amigos
amines
amnion
amoeba
amoral
amount
amours
ampere
@ -215,7 +214,6 @@ animus
anions
ankles
anklet
annals
anneal
annoys
annual
@ -364,7 +362,6 @@ auntie
aureus
aurora
author
autism
autumn
avails
avatar
@ -638,14 +635,12 @@ bigwig
bijoux
bikers
biking
bikini
bilges
bilked
bilker
billed
billet
billow
bimbos
binary
binder
binged
@ -710,8 +705,6 @@ blocks
blokes
blonde
blonds
bloods
bloody
blooms
bloops
blotch
@ -817,8 +810,6 @@ bounds
bounty
bovine
bovver
bowels
bowers
bowing
bowled
bowleg
@ -827,10 +818,8 @@ bowman
bowmen
bowwow
boxcar
boxers
boxier
boxing
boyish
braced
bracer
braces
@ -861,7 +850,6 @@ breach
breads
breaks
breams
breast
breath
breech
breeds
@ -872,9 +860,6 @@ brevet
brewed
brewer
briars
bribed
briber
bribes
bricks
bridal
brides
@ -926,13 +911,7 @@ buffed
buffer
buffet
bugged
bugger
bugled
bugler
bugles
builds
bulged
bulges
bulked
bulled
bullet
@ -1340,8 +1319,6 @@ clingy
clinic
clinks
clique
cloaca
cloaks
cloche
clocks
clomps
@ -1448,7 +1425,6 @@ comply
compos
conchs
concur
condom
condor
condos
coneys
@ -1568,8 +1544,6 @@ cranes
cranks
cranky
cranny
crapes
crappy
crated
crater
crates
@ -1585,7 +1559,6 @@ crazes
creaks
creaky
creams
creamy
crease
create
creche
@ -1594,8 +1567,6 @@ credos
creeds
creeks
creels
creeps
creepy
cremes
creole
crepes
@ -1728,9 +1699,6 @@ dainty
daises
damage
damask
dammed
dammit
damned
damped
dampen
damper
@ -1754,7 +1722,6 @@ darers
daring
darken
darker
darkie
darkly
darned
darner
@ -1763,8 +1730,6 @@ darter
dashed
dasher
dashes
daters
dating
dative
daubed
dauber
@ -1921,7 +1886,6 @@ dharma
dhotis
diadem
dialog
diaper
diatom
dibble
dicier
@ -1943,7 +1907,6 @@ digits
diking
diktat
dilate
dildos
dilute
dimity
dimmed
@ -2058,7 +2021,6 @@ dotted
double
doubly
doubts
douche
doughy
dourer
dourly
@ -2139,15 +2101,6 @@ duenna
duffed
duffer
dugout
dulcet
dulled
duller
dumber
dumbly
dumbos
dumdum
dumped
dumper
dunces
dunged
dunked
@ -2285,7 +2238,6 @@ endows
endued
endues
endure
enemas
energy
enfold
engage
@ -2333,7 +2285,6 @@ erects
ermine
eroded
erodes
erotic
errand
errant
errata
@ -2344,7 +2295,6 @@ eructs
erupts
escape
eschew
escort
escrow
escudo
espied
@ -2363,7 +2313,6 @@ ethnic
etudes
euchre
eulogy
eunuch
eureka
evaded
evader
@ -2392,7 +2341,6 @@ exempt
exerts
exeunt
exhale
exhort
exhume
exiled
exiles
@ -2415,7 +2363,6 @@ extant
extend
extent
extols
extort
extras
exuded
exudes
@ -2440,7 +2387,6 @@ faeces
faerie
faffed
fagged
faggot
failed
faille
fainer
@ -2473,18 +2419,10 @@ faring
farmed
farmer
farrow
farted
fascia
fasted
fasten
faster
father
fathom
fating
fatsos
fatten
fatter
fatwas
faucet
faults
faulty
@ -2532,7 +2470,6 @@ fesses
festal
fester
feting
fetish
fetter
fettle
feudal
@ -2617,9 +2554,7 @@ flaked
flakes
flambe
flamed
flamer
flames
flange
flanks
flared
flares
@ -2754,8 +2689,6 @@ franks
frappe
frauds
frayed
freaks
freaky
freely
freest
freeze
@ -2795,8 +2728,6 @@ fryers
frying
ftpers
ftping
fucked
fucker
fuddle
fudged
fudges
@ -2891,10 +2822,7 @@ gasbag
gashed
gashes
gasket
gasman
gasmen
gasped
gassed
gasses
gateau
gather
@ -3104,7 +3032,6 @@ grimed
grimes
grimly
grinds
gringo
griped
griper
gripes
@ -3186,8 +3113,6 @@ gypsum
gyrate
gyving
habits
hacked
hacker
hackle
hadith
haggis
@ -3195,8 +3120,6 @@ haggle
hailed
hairdo
haired
hajjes
hajjis
halest
haling
halite
@ -3223,11 +3146,8 @@ happen
haptic
harass
harden
harder
hardly
harems
haring
harked
harlot
harmed
harped
@ -3407,7 +3327,6 @@ hoofed
hoofer
hookah
hooked
hooker
hookup
hooped
hoopla
@ -3459,8 +3378,6 @@ huffed
hugely
hugest
hugged
hulled
huller
humane
humans
humble
@ -3667,8 +3584,6 @@ jacket
jading
jagged
jaguar
jailed
jailer
jalopy
jammed
jangle
@ -3689,8 +3604,6 @@ jejune
jelled
jellos
jennet
jerked
jerkin
jersey
jested
jester
@ -3814,11 +3727,7 @@ kidded
kidder
kiddie
kiddos
kidnap
kidney
killed
killer
kilned
kilted
kilter
kimono
@ -3827,15 +3736,11 @@ kinder
kindle
kindly
kingly
kinked
kiosks
kipped
kipper
kirsch
kismet
kissed
kisser
kisses
kiting
kitsch
kitted
@ -3847,10 +3752,6 @@ kluges
klutzy
knacks
knaves
kneads
kneels
knells
knifed
knifes
knight
knives
@ -4210,8 +4111,6 @@ lunges
lupine
lupins
luring
lurked
lurker
lusher
lushes
lushly
@ -4608,7 +4507,6 @@ muggle
mukluk
mulcts
mulish
mullah
mulled
mullet
mumble
@ -4721,9 +4619,6 @@ nickel
nicker
nickle
nieces
niggas
niggaz
nigger
niggle
nigher
nights
@ -4736,7 +4631,6 @@ ninjas
ninths
nipped
nipper
nipple
nitric
nitwit
nixing
@ -4781,15 +4675,6 @@ nozzle
nuance
nubbin
nubile
nuclei
nudest
nudged
nudges
nudism
nudist
nudity
nugget
nuking
numbed
number
numbly
@ -4804,7 +4689,6 @@ nutter
nuzzle
nybble
nylons
nympho
nymphs
oafish
oaring
@ -4885,7 +4769,6 @@ opting
option
opuses
oracle
orally
orange
orated
orates
@ -4897,7 +4780,6 @@ ordeal
orders
ordure
organs
orgasm
orgies
oriels
orient
@ -4993,10 +4875,6 @@ pander
panels
panics
panned
panted
pantie
pantos
pantry
papacy
papaya
papers
@ -5078,7 +4956,6 @@ pebble
pebbly
pecans
pecked
pecker
pectic
pectin
pedalo
@ -5151,9 +5028,6 @@ phenom
phials
phlegm
phloem
phobia
phobic
phoebe
phoned
phones
phoney
@ -5228,9 +5102,6 @@ piques
piracy
pirate
pirogi
pissed
pisser
pisses
pistes
pistil
pistol
@ -5311,8 +5182,6 @@ pogrom
points
pointy
poised
poises
poison
pokers
pokeys
pokier
@ -5422,7 +5291,6 @@ preyed
priced
prices
pricey
pricks
prided
prides
priers
@ -5602,14 +5470,9 @@ rabbit
rabble
rabies
raceme
racers
racial
racier
racily
racing
racism
racist
racked
racket
radars
radial
@ -5661,8 +5524,6 @@ rapers
rapids
rapier
rapine
raping
rapist
rapped
rappel
rapper
@ -5747,7 +5608,6 @@ recoup
rectal
rector
rectos
rectum
recurs
recuse
redact
@ -5891,7 +5751,6 @@ resume
retail
retain
retake
retard
retell
retest
retied
@ -6125,8 +5984,6 @@ sadden
sadder
saddle
sadhus
sadism
sadist
safari
safely
safest
@ -6364,16 +6221,6 @@ severs
sewage
sewers
sewing
sexier
sexily
sexing
sexism
sexist
sexpot
sextet
sexton
sexual
shabby
shacks
shaded
shades
@ -6383,10 +6230,7 @@ shaggy
shaken
shaker
shakes
shalom
shaman
shamed
shames
shandy
shanks
shanty
@ -6432,7 +6276,6 @@ shirks
shirrs
shirts
shirty
shitty
shiver
shoals
shoats
@ -6575,9 +6418,6 @@ slangy
slants
slated
slates
slaved
slaver
slaves
slayed
slayer
sleaze
@ -6672,7 +6512,6 @@ snarks
snarky
snarls
snarly
snatch
snazzy
sneaks
sneaky
@ -6716,7 +6555,6 @@ socket
sodded
sodden
sodium
sodomy
soever
soften
softer
@ -7468,7 +7306,6 @@ torrid
torsos
tortes
tossed
tosser
tosses
tossup
totals
@ -7686,7 +7523,6 @@ unhook
unhurt
unions
unique
unisex
unison
united
unites
@ -7793,7 +7629,6 @@ vacant
vacate
vacuum
vagary
vagina
vaguer
vainer
vainly
@ -7930,9 +7765,6 @@ votive
vowels
vowing
voyage
voyeur
vulgar
vulvae
wabbit
wacker
wackos
@ -7975,7 +7807,6 @@ wander
wangle
waning
wanked
wanker
wanner
wanted
wanton


@ -1944,7 +1944,6 @@ dosage
dose
dotted
doubling
douche
dove
down
dowry
@ -3015,7 +3014,6 @@ groom
groove
grooving
groovy
grope
ground
grouped
grout
@ -3135,7 +3133,6 @@ happiness
happy
harbor
hardcopy
hardcore
hardcover
harddisk
hardened
@ -6553,7 +6550,6 @@ swimmer
swimming
swimsuit
swimwear
swinger
swinging
swipe
swirl
@ -7464,9 +7460,7 @@ villain
vindicate
vineyard
vintage
violate
violation
violator
violet
violin
viper

File diff suppressed because it is too large


@ -0,0 +1,31 @@
"""empty message
Revision ID: 4bc54632d9aa
Revises: 46ecb648a47e
Create Date: 2023-11-07 14:02:17.610226
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '4bc54632d9aa'
down_revision = '46ecb648a47e'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index('ix_newsletter_subject', table_name='newsletter')
op.create_index(op.f('ix_newsletter_subject'), 'newsletter', ['subject'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_newsletter_subject'), table_name='newsletter')
op.create_index('ix_newsletter_subject', 'newsletter', ['subject'], unique=True)
# ### end Alembic commands ###


@ -0,0 +1,29 @@
"""empty message
Revision ID: 818b0a956205
Revises: 4bc54632d9aa
Create Date: 2024-02-01 10:43:46.253184
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '818b0a956205'
down_revision = '4bc54632d9aa'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('alias', sa.Column('last_email_log_id', sa.Integer(), nullable=True))
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column('alias', 'last_email_log_id')
# ### end Alembic commands ###


@ -0,0 +1,48 @@
"""empty message
Revision ID: 52510a633d6f
Revises: 818b0a956205
Create Date: 2024-03-12 12:46:24.161644
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "52510a633d6f"
down_revision = "818b0a956205"
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column(
"alias", sa.Column("flags", sa.BigInteger(), server_default="0", nullable=False)
)
with op.get_context().autocommit_block():
op.create_index(op.f("ix_alias_flags"), "alias", ["flags"], unique=False)
op.create_index(op.f("ix_job_state"), "job", ["state"], unique=False)
op.create_index(
"ix_state_run_at_taken_at",
"job",
["state", "run_at", "taken_at"],
unique=False,
)
op.create_index(
op.f("ix_notification_user_id"), "notification", ["user_id"], unique=False
)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
with op.get_context().autocommit_block():
op.drop_index(op.f("ix_notification_user_id"), table_name="notification")
op.drop_index("ix_state_run_at_taken_at", table_name="job")
op.drop_index(op.f("ix_job_state"), table_name="job")
op.drop_index(op.f("ix_alias_flags"), table_name="alias")
op.drop_column("alias", "flags")
# ### end Alembic commands ###


@ -0,0 +1,29 @@
"""empty message
Revision ID: fa2f19bb4e5a
Revises: 52510a633d6f
Create Date: 2024-04-09 13:12:26.305340
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'fa2f19bb4e5a'
down_revision = '52510a633d6f'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('users', sa.Column('enable_data_breach_check', sa.Boolean(), server_default='0', nullable=False))
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column('users', 'enable_data_breach_check')
# ### end Alembic commands ###


@ -0,0 +1,38 @@
"""Create sync_event table
Revision ID: 06a9a7133445
Revises: fa2f19bb4e5a
Create Date: 2024-05-17 13:11:20.402259
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '06a9a7133445'
down_revision = 'fa2f19bb4e5a'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('sync_event',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('created_at', sqlalchemy_utils.types.arrow.ArrowType(), nullable=False),
sa.Column('updated_at', sqlalchemy_utils.types.arrow.ArrowType(), nullable=True),
sa.Column('content', sa.LargeBinary(), nullable=False),
sa.Column('taken_time', sqlalchemy_utils.types.arrow.ArrowType(), nullable=True),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_sync_event_created_at'), 'sync_event', ['created_at'], unique=False)
op.create_index(op.f('ix_sync_event_taken_time'), 'sync_event', ['taken_time'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('sync_event')
# ### end Alembic commands ###


@ -4,6 +4,7 @@ import subprocess
from time import sleep
from typing import List, Dict
import arrow
import newrelic.agent
from app.db import Session
@ -93,11 +94,44 @@ def log_nb_db_connection():
newrelic.agent.record_custom_metric("Custom/nb_db_connections", nb_connection)
@newrelic.agent.background_task()
def log_pending_to_process_events():
r = Session.execute("select count(*) from sync_event WHERE taken_time IS NULL;")
events_pending = list(r)[0][0]
LOG.d("number of events pending to process %s", events_pending)
newrelic.agent.record_custom_metric(
"Custom/sync_events_pending_to_process", events_pending
)
@newrelic.agent.background_task()
def log_events_pending_dead_letter():
since = arrow.now().shift(minutes=-10).datetime
r = Session.execute(
"""
SELECT COUNT(*)
FROM sync_event
WHERE (taken_time IS NOT NULL AND taken_time < :since)
OR (taken_time IS NULL AND created_at < :since)
""",
{"since": since},
)
events_pending = list(r)[0][0]
LOG.d("number of events pending dead letter %s", events_pending)
newrelic.agent.record_custom_metric(
"Custom/sync_events_pending_dead_letter", events_pending
)
if __name__ == "__main__":
exporter = MetricExporter(get_newrelic_license())
while True:
log_postfix_metrics()
log_nb_db_connection()
log_pending_to_process_events()
log_events_pending_dead_letter()
Session.close()
exporter.run()


@ -0,0 +1,44 @@
#!/usr/bin/env python3
import argparse
import time
from sqlalchemy import func
from app.models import Alias
from app.db import Session
parser = argparse.ArgumentParser(
prog="Backfill alias", description="Backfill alias las use"
)
parser.add_argument(
"-s", "--start_alias_id", default=0, type=int, help="Initial alias_id"
)
parser.add_argument("-e", "--end_alias_id", default=0, type=int, help="Last alias_id")
args = parser.parse_args()
alias_id_start = args.start_alias_id
max_alias_id = args.end_alias_id
if max_alias_id == 0:
max_alias_id = Session.query(func.max(Alias.id)).scalar()
print(f"Checking alias {alias_id_start} to {max_alias_id}")
step = 1000
el_query = "SELECT alias_id, MAX(id) from email_log where alias_id>=:start AND alias_id < :end GROUP BY alias_id"
alias_query = "UPDATE alias set last_email_log_id = :el_id where id = :alias_id"
updated = 0
start_time = time.time()
for batch_start in range(alias_id_start, max_alias_id, step):
rows = Session.execute(el_query, {"start": batch_start, "end": batch_start + step})
for row in rows:
Session.execute(alias_query, {"alias_id": row[0], "el_id": row[1]})
Session.commit()
updated += 1
elapsed = time.time() - start_time
time_per_alias = elapsed / (updated + 1)
last_batch_id = batch_start + step
remaining = max_alias_id - last_batch_id
time_remaining = (max_alias_id - last_batch_id) * time_per_alias
hours_remaining = time_remaining / 3600.0
print(
f"\rAlias {batch_start}/{max_alias_id} {updated} {hours_remaining:.2f}hrs remaining"
)
print("")


@ -0,0 +1,37 @@
#!/usr/bin/env python3
import argparse
import random
import time
from sqlalchemy import func
from app import config
from app.models import Alias, Contact
from app.db import Session
parser = argparse.ArgumentParser(
prog=f"Replace {config.NOREPLY}",
description=f"Replace {config.NOREPLY} from contacts reply email",
)
args = parser.parse_args()
max_alias_id: int = Session.query(func.max(Alias.id)).scalar()
start = time.time()
tests = 1000
for i in range(tests):
alias = (
Alias.filter(Alias.id > int(random.random() * max_alias_id))
.order_by(Alias.id.asc())
.limit(1)
.first()
)
contact = Contact.filter_by(alias_id=alias.id).order_by(Contact.id.asc()).first()
mailboxes = alias.mailboxes
user = alias.user
if i % 10 == 0:
print(f"{i} -> {alias.id}")
end = time.time()
time_taken = end - start
print(f"Took {time_taken} -> {time_taken/tests} per test")


@ -0,0 +1,56 @@
#!/usr/bin/env python3
import argparse
import time
from sqlalchemy import func
from app.models import Alias, SLDomain
from app.db import Session
parser = argparse.ArgumentParser(
prog="Mark partner created aliases with the PARTNER_CREATED flag",
)
parser.add_argument(
"-s", "--start_alias_id", default=0, type=int, help="Initial alias_id"
)
parser.add_argument("-e", "--end_alias_id", default=0, type=int, help="Last alias_id")
args = parser.parse_args()
alias_id_start = args.start_alias_id
max_alias_id = args.end_alias_id
if max_alias_id == 0:
max_alias_id = Session.query(func.max(Alias.id)).scalar()
print(f"Updating aliases from {alias_id_start} to {max_alias_id}")
domains = SLDomain.filter(SLDomain.partner_id.isnot(None)).all()
cond = [f"email like '%{domain.domain}'" for domain in domains]
sql_or_cond = " OR ".join(cond)
sql = f"UPDATE alias set flags = (flags | :flag) WHERE id >= :start and id<:end and flags & :flag = 0 and ({sql_or_cond})"
print(sql)
step = 1000
updated = 0
start_time = time.time()
for batch_start in range(alias_id_start, max_alias_id, step):
updated += Session.execute(
sql,
{
"start": batch_start,
"end": batch_start + step,
"flag": Alias.FLAG_PARTNER_CREATED,
},
).rowcount
elapsed = time.time() - start_time
time_per_alias = elapsed / (batch_start - alias_id_start + step)
last_batch_id = batch_start + step
remaining = max_alias_id - last_batch_id
time_remaining = (max_alias_id - last_batch_id) * time_per_alias
hours_remaining = time_remaining / 3600.0
percent = int(
((batch_start - alias_id_start) * 100) / (max_alias_id - alias_id_start)
)
print(
f"\rAlias {batch_start}/{max_alias_id} {percent}% {updated} updated {hours_remaining:.2f}hrs remaining"
)
print(f"Updated aliases up to {max_alias_id}")


@ -0,0 +1,53 @@
#!/usr/bin/env python3
import argparse
import time
from app import config
from app.email_utils import generate_reply_email
from app.email_validation import is_valid_email
from app.models import Alias
from app.db import Session
parser = argparse.ArgumentParser(
prog=f"Replace {config.NOREPLY}",
description=f"Replace {config.NOREPLY} from contacts reply email",
)
args = parser.parse_args()
el_query = "SELECT id, alias_id, website_email from contact where id>=:last_id AND reply_email=:reply_email ORDER BY id ASC LIMIT :step"
update_query = "UPDATE contact SET reply_email=:reply_email WHERE id=:contact_id "
updated = 0
start_time = time.time()
step = 100
last_id = 0
print(f"Replacing contacts with reply_email={config.NOREPLY}")
while True:
rows = Session.execute(
el_query, {"last_id": last_id, "reply_email": config.NOREPLY, "step": step}
)
loop_updated = 0
for row in rows:
contact_id = row[0]
alias_id = row[1]
last_id = contact_id
website_email = row[2]
contact_email_for_reply = website_email if is_valid_email(website_email) else ""
alias = Alias.get(alias_id)
if alias is None:
print(f"CANNOT find alias {alias_id} in database for contact {contact_id}")
continue
reply_email = generate_reply_email(contact_email_for_reply, alias)
print(
f"Replacing contact {contact_id} with {website_email} reply_email for {reply_email}"
)
Session.execute(
update_query, {"contact_id": row[0], "reply_email": reply_email}
)
Session.commit()
updated += 1
loop_updated += 1
elapsed = time.time() - start_time
print(f"\rContact {last_id} done")
if loop_updated == 0:
break
print("")

poetry.lock generated

@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 1.6.1 and should not be changed by hand.
# This file is automatically @generated by Poetry 1.7.0 and should not be changed by hand.
[[package]]
name = "aiohttp"
@ -2831,6 +2831,32 @@ files = [
docs = ["ryd"]
jinja2 = ["ruamel.yaml.jinja2 (>=0.2)"]
[[package]]
name = "ruff"
version = "0.1.5"
description = "An extremely fast Python linter and code formatter, written in Rust."
optional = false
python-versions = ">=3.7"
files = [
{file = "ruff-0.1.5-py3-none-macosx_10_7_x86_64.whl", hash = "sha256:32d47fc69261c21a4c48916f16ca272bf2f273eb635d91c65d5cd548bf1f3d96"},
{file = "ruff-0.1.5-py3-none-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:171276c1df6c07fa0597fb946139ced1c2978f4f0b8254f201281729981f3c17"},
{file = "ruff-0.1.5-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:17ef33cd0bb7316ca65649fc748acc1406dfa4da96a3d0cde6d52f2e866c7b39"},
{file = "ruff-0.1.5-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b2c205827b3f8c13b4a432e9585750b93fd907986fe1aec62b2a02cf4401eee6"},
{file = "ruff-0.1.5-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bb408e3a2ad8f6881d0f2e7ad70cddb3ed9f200eb3517a91a245bbe27101d379"},
{file = "ruff-0.1.5-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:f20dc5e5905ddb407060ca27267c7174f532375c08076d1a953cf7bb016f5a24"},
{file = "ruff-0.1.5-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aafb9d2b671ed934998e881e2c0f5845a4295e84e719359c71c39a5363cccc91"},
{file = "ruff-0.1.5-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a4894dddb476597a0ba4473d72a23151b8b3b0b5f958f2cf4d3f1c572cdb7af7"},
{file = "ruff-0.1.5-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a00a7ec893f665ed60008c70fe9eeb58d210e6b4d83ec6654a9904871f982a2a"},
{file = "ruff-0.1.5-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:a8c11206b47f283cbda399a654fd0178d7a389e631f19f51da15cbe631480c5b"},
{file = "ruff-0.1.5-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:fa29e67b3284b9a79b1a85ee66e293a94ac6b7bb068b307a8a373c3d343aa8ec"},
{file = "ruff-0.1.5-py3-none-musllinux_1_2_i686.whl", hash = "sha256:9b97fd6da44d6cceb188147b68db69a5741fbc736465b5cea3928fdac0bc1aeb"},
{file = "ruff-0.1.5-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:721f4b9d3b4161df8dc9f09aa8562e39d14e55a4dbaa451a8e55bdc9590e20f4"},
{file = "ruff-0.1.5-py3-none-win32.whl", hash = "sha256:f80c73bba6bc69e4fdc73b3991db0b546ce641bdcd5b07210b8ad6f64c79f1ab"},
{file = "ruff-0.1.5-py3-none-win_amd64.whl", hash = "sha256:c21fe20ee7d76206d290a76271c1af7a5096bc4c73ab9383ed2ad35f852a0087"},
{file = "ruff-0.1.5-py3-none-win_arm64.whl", hash = "sha256:82bfcb9927e88c1ed50f49ac6c9728dab3ea451212693fe40d08d314663e412f"},
{file = "ruff-0.1.5.tar.gz", hash = "sha256:5cbec0ef2ae1748fb194f420fb03fb2c25c3258c86129af7172ff8f198f125ab"},
]
[[package]]
name = "s3transfer"
version = "0.3.3"
@ -3674,4 +3700,4 @@ testing = ["coverage (>=5.0.3)", "zope.event", "zope.testing"]
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
content-hash = "8bf71c74c8f4d1afe6b1ab0912702cdb47086474168bed8a9230c398abf349dd"
content-hash = "01afc410d21eeac0a0ac7e8ef6eeb0a991cf4bc091c3351049263462e205ff63"

proto/event.proto Normal file

@ -0,0 +1,45 @@
syntax = "proto3";
package simplelogin_events;
message UserPlanChange {
uint32 plan_end_time = 1;
}
message UserDeleted {
}
message AliasCreated {
uint32 alias_id = 1;
string alias_email = 2;
string alias_note = 3;
bool enabled = 4;
}
message AliasStatusChange {
uint32 alias_id = 1;
string alias_email = 2;
bool enabled = 3;
}
message AliasDeleted {
uint32 alias_id = 1;
string alias_email = 2;
}
message EventContent {
oneof content {
UserPlanChange user_plan_change = 1;
UserDeleted user_deleted = 2;
AliasCreated alias_created = 3;
AliasStatusChange alias_status_change = 4;
AliasDeleted alias_deleted = 5;
}
}
message Event {
uint32 user_id = 1;
string external_user_id = 2;
uint32 partner_id = 3;
EventContent content = 4;
}
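A hedged sketch of building one of these events from Python, assuming bindings generated with `protoc --python_out=. proto/event.proto` into a module named event_pb2 (the module name and all field values are assumptions for illustration):

    from event_pb2 import AliasCreated, Event, EventContent

    event = Event(
        user_id=1,
        external_user_id="ext-user-123",
        partner_id=2,
        content=EventContent(
            alias_created=AliasCreated(
                alias_id=10,
                alias_email="hello@example.com",
                alias_note="",
                enabled=True,
            )
        ),
    )
    payload = event.SerializeToString()  # bytes a sink can post as application/x-protobuf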

Some files were not shown because too many files have changed in this diff