Compare commits

...

71 Commits

Author SHA1 Message Date
Carlos Quintana 9d2a35b9c2
fix: monitoring table name (#2120) 2024-05-24 11:09:10 +02:00
Carlos Quintana 5f190d4b46
fix: monitoring table name 2024-05-24 10:52:08 +02:00
Carlos Quintana 6862ed3602
fix: event listener (#2119)
* fix: commit transaction after taking event

* feat: allow to reconnect to postgres for event listener

* chore: log sync events pending to process to metrics

* fix: make dead_letter runner able to process events without needing to have lock on the event

* chore: close Session after reconnect

* refactor: make EventSource emit only events that can be processed
2024-05-24 10:21:19 +02:00
Carlos Quintana 450322fff1
feat: allow to disable event-webhook (#2118) 2024-05-23 16:50:54 +02:00
Carlos Quintana aad6f59e96
Improve error handling on event sink (#2117)
* chore: make event_sink return success

* fix: add return to ConsoleEventSink
2024-05-23 15:05:47 +02:00
Carlos Quintana 8eccb05e33
feat: implement HTTP event sink (#2116)
* feat: implement HTTP event sink

* Update events/event_sink.py

---------

Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>
2024-05-23 11:32:45 +02:00
Carlos Quintana 3e0b7bb369
Add sync events (#2113)
* feat: add protocol buffers for events

* chore: add EventDispatcher

* chore: add WebhookEvent class

* chore: emit events

* feat: initial version of event listener

* chore: emit user plan change with new timestamp

* feat: emit metrics + add alias status to create event

* chore: add newrelic decorator to functions

* fix: event emitter fixes

* fix: take null end_time into account

* fix: avoid double-commits

* chore: move UserDeleted event to User.delete method

* db: add index to sync_event created_at and taken_time columns

* chore: add index to model
2024-05-23 10:27:08 +02:00
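
The sync-event plumbing added in the commit above surfaces in the alias_utils diff later in this compare: events are protobuf messages wrapped in an EventContent and handed to EventDispatcher. A minimal sketch of that call shape, assuming the imports shown in that diff (the wrapper function here is illustrative only, not part of the commit):

```python
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import AliasDeleted, EventContent


def notify_alias_deleted(user, alias_id: int) -> None:
    # Wrap the protobuf payload in EventContent; the dispatcher stores/forwards it
    # for the event listener (and later the webhook sink) to pick up.
    EventDispatcher.send_event(
        user, EventContent(alias_deleted=AliasDeleted(alias_id=alias_id))
    )
```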
Son Nguyen Kim 60ab8c15ec
show app page (#2110)
Co-authored-by: Son NK <son@simplelogin.io>
2024-05-22 15:43:36 +02:00
Son Nguyen Kim b5b167479f
Fix admin loop (#2103)
* mailbox page requires sudo

* fix the loop when non-admin user visits an admin URL

https://github.com/simple-login/app/issues/2101

---------

Co-authored-by: Son NK <son@simplelogin.io>
2024-05-10 18:52:12 +02:00
Adrià Casajús 8f12fabd81
Make hibp rate configurable (#2105) 2024-05-10 18:51:16 +02:00
Daniel Mühlbachler-Pietrzykowski b6004f3336
feat: use oidc well-known url (#2077) 2024-05-02 16:17:10 +02:00
Adrià Casajús 80c8bc820b
Do not double count AliasMailboxes with Aliases (#2095)
* Do not double count AliasMailboxes with aliases

* Keep Sl-Queue-id
2024-04-30 16:41:47 +02:00
Son Nguyen Kim 037bc9da36
mailbox page requires sudo (#2094)
Co-authored-by: Son NK <son@simplelogin.io>
2024-04-23 22:25:37 +02:00
Son Nguyen Kim ee0be3688f
Data breach (#2093)
* add User.enable_data_breach_check column

* user can turn on/off the data breach check

* only run data breach check for user who enables it

* add tips to run tests using a local DB (without docker)

* refactor True check

* trim trailing space

* fix test

* Apply suggestions from code review

Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>

* format

---------

Co-authored-by: Son NK <son@simplelogin.io>
Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>
2024-04-23 22:16:36 +02:00
Adrià Casajús 015036b499
Prevent proton mailboxes from enabling pgp encryption (#2086) 2024-04-12 15:19:41 +02:00
Son Nguyen Kim d5df91aab6
Premium user can enable data breach monitoring (#2084)
* add User.enable_data_breach_check column

* user can turn on/off the data breach check

* only run data breach check for user who enables it

* add tips to run tests using a local DB (without docker)

* refactor True check

* trim trailing space

* fix test

* Apply suggestions from code review

Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>

* format

---------

Co-authored-by: Son NK <son@simplelogin.io>
Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>
2024-04-12 10:39:23 +02:00
Adrià Casajús 2eb5feaa8f
Small improvements (#2082)
* Update logs with more relevant info for debugging purposes

* Improved logs for alias creation rate-limit

* Reduce sudo time to 120 secs

* log fixes

* Fix missing object to add to the session
2024-04-08 15:05:51 +02:00
Adrià Casajús 3c364da37d
Dmarc fix (#2079)
* Add log to spam check + remove invisible characters on import

* Update log
2024-03-26 11:43:33 +01:00
Adrià Casajús 36cf530ef8
Preserve X-SL-Queue-Id (#2076) 2024-03-22 11:00:06 +01:00
Adrià Casajús 0da1811311
Cleanup old data (#2066)
* Cleanup tasks

* Update

* Added tests

* Create cron job

* Delete old data cron

* Fix import

* import fix

* Added delete + script to disable pgp for proton mboxes
2024-03-18 16:00:21 +01:00
Adrià Casajús f2fcaa6c60
Cleanup also message-id headers from linebreaks (#2067) 2024-03-18 14:27:38 +01:00
Adrià Casajús aa2c676b5e
Only check HIBP alias of paid users (#2065) 2024-03-15 10:13:06 +01:00
Adrià Casajús 30ddd4c807
Update oneshot commands (#2064)
* Update oneshot commands

* fix

* Fix test_load command

* Rename to avoid test executing it

* Do HIBP breach check in batches instead of a single load
2024-03-14 16:03:43 +01:00
Son Nguyen Kim f5babd9c81
Move import export back to setting (#2063)
* replace black by ruff

* move alias import/export to settings

* fix html closing tag

* add rate limit for alias import & export

---------

Co-authored-by: Son NK <son@simplelogin.io>
2024-03-14 15:56:35 +01:00
Adrià Casajús 74b811dd35
Update oneshot commands (#2060)
* Update oneshot commands

* fix

* Fix test_load command

* Rename to avoid test executing it
2024-03-14 11:11:50 +01:00
martadams89 e6c51bcf20
ci: remove v7 (#2062) 2024-03-14 09:49:45 +01:00
martadams89 4bfc6b9aca
ci: add arm docker images (#2056)
* ci: add arm docker images

* ci: remove armv6

* ci: update workflow to run only on path

* Update .github/workflows/main.yml

---------

Co-authored-by: Adrià Casajús <acasajus@users.noreply.github.com>
2024-03-13 15:20:47 +01:00
Adrià Casajús e96de79665
Add missing indexes and mark aliases as created by partner (#2058)
* Add missing indexes and mark aliases as created by partner

* Configure if we should skip the partner aliases or not
2024-03-13 14:30:17 +01:00
Daniel Mühlbachler-Pietrzykowski a608503df6
feat: add generic OIDC connect (#2046) 2024-03-13 14:30:00 +01:00
Son Nguyen Kim 0c3c6db2ab
point to the new safari extension (#2059)
Co-authored-by: Son NK <son@simplelogin.io>
2024-03-12 23:05:54 +01:00
Adrià Casajús 9719a36dab
Do not replace unsubs that go to UNSUBSCRIBER (#2051) 2024-03-06 16:26:10 +01:00
Adrià Casajús a7d4bd15a7
If the transactional_id is None do nothing 2024-03-04 17:47:06 +01:00
Adrià Casajús 565f6dc142
If there is no transactional id, skip it (#2047) 2024-03-04 17:44:19 +01:00
Adrià Casajús 76423527dd
Update HIBP async script (#2043)
* Update HIBP async script

* Fix: continue instead of return
2024-03-04 13:12:38 +01:00
Adrià Casajús 501b225e40
Require sudo for account changes (#2041)
* Move accounts settings under sudo

* Fixed sudo mode

* Add a log message

* Update test

* Renamed sudo_setting to account_setting

* Moved simple login data export and alias/import export to account settings

* Move account settings to the top-right dropdown
2024-02-29 11:20:29 +01:00
Adrià Casajús 1dada1a4b5
Allow to skip creating transactional emails (#2042) 2024-02-27 16:52:45 +01:00
Adrià Casajús 37f227da42
Fix format 2024-02-27 09:41:47 +01:00
Adrià Casajús 97e68159c5
Fix: Never use NOREPLY to create contacts (#2039) 2024-02-27 09:29:53 +01:00
Adrià Casajús 673e19b287
Sanitize unused next parameter (#2040) 2024-02-26 19:23:03 +01:00
Sukuna 5959d40a00
Added comments to test_login.py (#2035)
* Added comments to test_login.py

- Added comments to each test function to provide clear documentation of the test steps.
- Comments detail the purpose of each test, the actions taken, and the expected outcomes.
- Improved readability and maintainability of the test suite.
- No changes in functionality; only added comments for better code understanding.

* Removed comments from import file in test_login.py
2024-02-26 17:41:54 +01:00
Adrià Casajús 173ae6a221
Allow to soft-delete users (#2034)
* Allow the possibility of soft-deleting users

* Unschedule for delete after link

* Add dry run to the cron
2024-02-22 17:38:34 +01:00
Adrià Casajús eb92823ef8
Removed potentially nsfw words 2024-02-20 11:17:28 +01:00
Adrià Casajús 363b851f61
Fix: use proper bucket time for the rate limit 2024-02-20 11:13:06 +01:00
Adrià Casajús d0a6b8ed79
Add start and end flags to parallelize call (#2033) 2024-02-19 16:46:35 +01:00
Adrià Casajús 50c130a3a3
Store the latest email_log id in the alias to simplify dashboard query (#2022)
* Store the latest email_log id in the alias to simplify dashboard query

* Fix test

* Add script to migrate users last email_log_id to alias

* Always update the alias last_email_log_id automatically

* Only set the alias_id if it is set

* Fix test with randomization

* Fix notification test

* Also remove explicit set on tests

* Rate limit alias creation to prevent abuse (#2021)

* Rate limit alias creation to prevent abuse

* Limit in secs

* Calculate bucket time

* fix exception

* Tune limits

* Move rate limit config to configuration (#2023)

* Fix dropdown item in header (#2024)

* Add option for admin to stop trial (#2026)

* Fix: if redis is not configured do not enable rate limit (#2027)

* support product IDs for the new Mac app (#2028)

Co-authored-by: Son NK <son@simplelogin.io>

* Add metrics to rate limit (#2029)

* Order domains alphabetically when retrieving them (#2030)

* Removed unused import

* Remove debug info

---------

Co-authored-by: D-Bao <49440133+D-Bao@users.noreply.github.com>
Co-authored-by: Son Nguyen Kim <son.nguyen@proton.ch>
Co-authored-by: Son NK <son@simplelogin.io>
2024-02-15 15:48:02 +01:00
Adrià Casajús b462c256d3
Order domains alphabetically when retrieving them (#2030) 2024-02-08 15:36:06 +01:00
Adrià Casajús f756b04ead
Add metrics to rate limit (#2029) 2024-02-06 11:55:45 +01:00
Son Nguyen Kim 05d18c23cc
support product IDs for the new Mac app (#2028)
Co-authored-by: Son NK <son@simplelogin.io>
2024-02-06 11:54:02 +01:00
Adrià Casajús 4a7c0293f8
Fix: if redis is not configured do not enable rate limit (#2027) 2024-02-05 14:53:01 +01:00
Adrià Casajús 30aaf118e7
Add option for admin to stop trial (#2026) 2024-02-05 13:47:39 +01:00
D-Bao 7b0d6dae1b
Fix dropdown item in header (#2024) 2024-02-02 10:23:05 +01:00
Adrià Casajús b6f1cecee9
Move rate limit config to configuration (#2023) 2024-02-01 14:47:15 +01:00
Adrià Casajús d12e776949
Rate limit alias creation to prevent abuse (#2021)
* Rate limit alias creation to prevent abuse

* Limit in secs

* Calculate bucket time

* fix exception

* Tune limits
2024-01-30 18:29:59 +01:00
Ed b8dad2d657
Update README.md (#2011)
Updated README.md to prevent Nginx redirecting the browser to the local address of the machine
2024-01-26 10:30:16 +01:00
Son Nguyen Kim 860ce03f2a
fix footer spacing again (#2018)
Co-authored-by: Son NK <son@simplelogin.io>
2024-01-26 10:27:57 +01:00
Son Nguyen Kim 71bb7bc795
fix space issue on footer (#2017)
Co-authored-by: Son NK <son@simplelogin.io>
2024-01-23 14:57:58 +01:00
Adrià Casajús 761420ece9
Prevent mailboxes that have been disabled from being used again (#2016)
* Prevent mailboxes that have been disabled from being used again

* Improve test

* Get one user since it will be unique
2024-01-23 14:57:40 +01:00
Adrià Casajús c3848862c3
Fix: limit the id sizes we generate and remove spaces after unidecode 2024-01-22 17:42:58 +01:00
Adrià Casajús da09db3864
Do not allow free users to create reverse alias to reduce abuse (#2013)
* Do not allow free users to create reverse alias to reduce abuse

* Update format

* Move function under user

* Update tests
2024-01-16 14:51:01 +01:00
Adrià Casajús 44138e25a5
Fix: Dedup the list of mailboxes for an alias (#2010) 2024-01-16 14:50:39 +01:00
Adrià Casajús b541ca4ceb
Fix typo in email (#2008) 2024-01-10 10:30:16 +01:00
Revi99 66c18e2f8e
small fixes (#2001)
* add forum mention

* add forum mention

* Add forum mention

* add forum mention

* fix my mistake

* fix
2024-01-08 21:40:52 +01:00
Son Nguyen Kim 4a046c5f6f
fix error when user logs out, go back to /dashboard and has the server error (#2003)
* fix error when user logs out, go back to /dashboard and has the server error

* reformat files. Not run ruff on migrations/ and .venv

---------

Co-authored-by: Son NK <son@simplelogin.io>
2024-01-05 14:30:07 +01:00
Ueri8 a731bf4435
Small fixes due to SL moving to Switzerland (#1999)
* Fix footer due to recent changes

* Fix forum mention
2024-01-04 12:07:59 +01:00
SecurityGuy f3127dc857
Generate working DKIM keys by adding -traditional flag and update NGINX instructions to avoid breaking certbot (#1989)
* Update README.md

Add -traditional option to openssl genrsa to avoid Python DKIM library (dkimpy) error that prevents email from being sent:

dkim.asn1.ASN1FormatError: Unexpected tag (got 30, expecting 02)

Ref: https://bugs.launchpad.net/dkimpy/+bug/1708917

* Update NGINX instructions

Include warning to delete /etc/nginx/sites-enabled/default to avoid a conflict that breaks certbot.
2024-01-03 14:08:39 +01:00
Kelp8 d9d28d3c75
I don’t think we really need that (#1992) 2024-01-03 14:08:05 +01:00
Kelp8 bca6bfa617
Remove sensitive words (#1994) 2024-01-03 12:38:13 +01:00
Agent-XD 5d6a4963a0
Small fixes (#1991)
* remove proprietary mention

* Add forum mention

* Sync activation.txt with activation.html

* Add subdomain information

* Make info look better

* Fix wording
2024-01-03 12:35:42 +01:00
Kelp8 00737f68de
Minor wordings change (#1985)
* Wording changes

* Add information to avoid being put in SPAM

* Remove word repeating

* Add forum mention

* Add forum mention to header.html

* Add info to avoid person marking as SPAM
2024-01-02 13:20:48 +01:00
Joseph Demcher 9ae206ec77
correct alias version in api.md (#1981) 2024-01-02 12:35:56 +01:00
Agent-XD 9452b14e10
Remove sensitive words (#1983)
* Remove sensitive words from test_words.txt

* Remove sensitive from words.txt

* remove sensitive from words_alpha.txt

* Update test_words.txt
2024-01-02 12:33:27 +01:00
126 changed files with 3493 additions and 370953 deletions

View File

@ -1,7 +1,6 @@
name: Test and lint
on:
push:
on: [push, pull_request]
jobs:
lint:
@ -139,6 +138,12 @@ jobs:
with:
fetch-depth: 0
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Create Sentry release
uses: getsentry/action-release@v1
env:
@ -158,6 +163,7 @@ jobs:
uses: docker/build-push-action@v3
with:
context: .
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ steps.meta.outputs.tags }}

View File

@ -68,6 +68,12 @@ For most tests, you will need to have ``redis`` installed and started on your ma
sh scripts/run-test.sh
```
You can also run tests using a local Postgres DB to speed things up. This can be done by
- creating an empty test DB and running the database migration by `dropdb test && createdb test && DB_URI=postgresql://localhost:5432/test alembic upgrade head`
- replacing the `DB_URI` in `test.env` file by `DB_URI=postgresql://localhost:5432/test`
## Run the code locally
Install npm packages
@ -151,10 +157,10 @@ Here are the small sum-ups of the directory structures and their roles:
## Pull request
The code is formatted using https://github.com/psf/black, to format the code, simply run
The code is formatted using [ruff](https://github.com/astral-sh/ruff), to format the code, simply run
```
poetry run black .
poetry run ruff format .
```
The code is also checked with `flake8`, make sure to run `flake8` before creating the pull request by

View File

@ -74,7 +74,7 @@ Setting up DKIM is highly recommended to reduce the chance your emails ending up
First you need to generate a private and public key for DKIM:
```bash
openssl genrsa -out dkim.key 1024
openssl genrsa -out dkim.key -traditional 1024
openssl rsa -in dkim.key -pubout -out dkim.pub.key
```
@ -510,11 +510,14 @@ server {
server_name app.mydomain.com;
location / {
proxy_pass http://localhost:7777;
proxy_pass http://localhost:7777;
proxy_set_header Host $host;
}
}
```
Note: If `/etc/nginx/sites-enabled/default` exists, delete it or certbot will fail due to the conflict. The `simplelogin` file should be the only file in `sites-enabled`.
Reload Nginx with the command below
```bash

View File

@ -168,6 +168,8 @@ class NewUserStrategy(ClientMergeStrategy):
class ExistingUnlinkedUserStrategy(ClientMergeStrategy):
def process(self) -> LinkResult:
# IF it was scheduled to be deleted. Unschedule it.
self.user.delete_on = None
partner_user = ensure_partner_user_exists_for_user(
self.link_request, self.user, self.partner
)
@ -246,6 +248,8 @@ def link_user(
) -> LinkResult:
# Sanitize email just in case
link_request.email = sanitize_email(link_request.email)
# If it was scheduled to be deleted. Unschedule it.
current_user.delete_on = None
partner_user = ensure_partner_user_exists_for_user(
link_request, current_user, partner
)

View File

@ -46,7 +46,8 @@ class SLModelView(sqla.ModelView):
def inaccessible_callback(self, name, **kwargs):
# redirect to login page if user doesn't have access
return redirect(url_for("auth.login", next=request.url))
flash("You don't have access to the admin page", "error")
return redirect(url_for("dashboard.index", next=request.url))
def on_model_change(self, form, model, is_created):
changes = {}
@ -214,6 +215,20 @@ class UserAdmin(SLModelView):
Session.commit()
@action(
"remove trial",
"Stop trial period",
"Remove trial for this user?",
)
def stop_trial(self, ids):
for user in User.filter(User.id.in_(ids)):
user.trial_end = None
flash(f"Stopped trial for {user}", "success")
AdminAuditLog.stop_trial(current_user.id, user.id)
Session.commit()
@action(
"disable_otp_fido",
"Disable OTP & FIDO",

View File

@ -25,6 +25,8 @@ from app.email_utils import (
render,
)
from app.errors import AliasInTrashError
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import AliasDeleted, AliasStatusChange, EventContent
from app.log import LOG
from app.models import (
Alias,
@ -308,31 +310,36 @@ def delete_alias(alias: Alias, user: User):
Delete an alias and add it to either global or domain trash
Should be used instead of Alias.delete, DomainDeletedAlias.create, DeletedAlias.create
"""
# save deleted alias to either global or domain trash
LOG.i(f"User {user} has deleted alias {alias}")
# save deleted alias to either global or domain trash
if alias.custom_domain_id:
if not DomainDeletedAlias.get_by(
email=alias.email, domain_id=alias.custom_domain_id
):
LOG.d("add %s to domain %s trash", alias, alias.custom_domain_id)
Session.add(
DomainDeletedAlias(
user_id=user.id,
email=alias.email,
domain_id=alias.custom_domain_id,
)
domain_deleted_alias = DomainDeletedAlias(
user_id=user.id,
email=alias.email,
domain_id=alias.custom_domain_id,
)
Session.add(domain_deleted_alias)
Session.commit()
LOG.i(
f"Moving {alias} to domain {alias.custom_domain_id} trash {domain_deleted_alias}"
)
else:
if not DeletedAlias.get_by(email=alias.email):
LOG.d("add %s to global trash", alias)
Session.add(DeletedAlias(email=alias.email))
deleted_alias = DeletedAlias(email=alias.email)
Session.add(deleted_alias)
Session.commit()
LOG.i(f"Moving {alias} to global trash {deleted_alias}")
LOG.i("delete alias %s", alias)
Alias.filter(Alias.id == alias.id).delete()
Session.commit()
EventDispatcher.send_event(
user, EventContent(alias_deleted=AliasDeleted(alias_id=alias.id))
)
def aliases_for_mailbox(mailbox: Mailbox) -> [Alias]:
"""
@ -458,3 +465,15 @@ def transfer_alias(alias, new_user, new_mailboxes: [Mailbox]):
alias.pinned = False
Session.commit()
def change_alias_status(alias: Alias, enabled: bool, commit: bool = False):
alias.enabled = enabled
event = AliasStatusChange(
alias_id=alias.id, alias_email=alias.email, enabled=enabled
)
EventDispatcher.send_event(alias.user, EventContent(alias_status_change=event))
if commit:
Session.commit()

View File

@ -33,6 +33,9 @@ def authorize_request() -> Optional[Tuple[str, int]]:
if g.user.disabled:
return jsonify(error="Disabled account"), 403
if not g.user.is_active():
return jsonify(error="Account does not exist"), 401
g.api_key = api_key
return None

View File

@ -201,10 +201,10 @@ def get_alias_infos_with_pagination_v3(
q = q.order_by(Alias.pinned.desc())
q = q.order_by(latest_activity.desc())
q = list(q.limit(page_limit).offset(page_id * page_size))
q = q.limit(page_limit).offset(page_id * page_size)
ret = []
for alias, contact, email_log, nb_reply, nb_blocked, nb_forward in q:
for alias, contact, email_log, nb_reply, nb_blocked, nb_forward in list(q):
ret.append(
AliasInfo(
alias=alias,
@ -358,7 +358,6 @@ def construct_alias_query(user: User):
else_=0,
)
).label("nb_forward"),
func.max(EmailLog.created_at).label("latest_email_log_created_at"),
)
.join(EmailLog, Alias.id == EmailLog.alias_id, isouter=True)
.filter(Alias.user_id == user.id)
@ -366,14 +365,6 @@ def construct_alias_query(user: User):
.subquery()
)
alias_contact_subquery = (
Session.query(Alias.id, func.max(Contact.id).label("max_contact_id"))
.join(Contact, Alias.id == Contact.alias_id, isouter=True)
.filter(Alias.user_id == user.id)
.group_by(Alias.id)
.subquery()
)
return (
Session.query(
Alias,
@ -385,23 +376,7 @@ def construct_alias_query(user: User):
)
.options(joinedload(Alias.hibp_breaches))
.options(joinedload(Alias.custom_domain))
.join(Contact, Alias.id == Contact.alias_id, isouter=True)
.join(EmailLog, Contact.id == EmailLog.contact_id, isouter=True)
.join(EmailLog, Alias.last_email_log_id == EmailLog.id, isouter=True)
.join(Contact, EmailLog.contact_id == Contact.id, isouter=True)
.filter(Alias.id == alias_activity_subquery.c.id)
.filter(Alias.id == alias_contact_subquery.c.id)
.filter(
or_(
EmailLog.created_at
== alias_activity_subquery.c.latest_email_log_created_at,
and_(
# no email log yet for this alias
alias_activity_subquery.c.latest_email_log_created_at.is_(None),
# to make sure only 1 contact is returned in this case
or_(
Contact.id == alias_contact_subquery.c.max_contact_id,
alias_contact_subquery.c.max_contact_id.is_(None),
),
),
)
)
)

View File

@ -184,7 +184,7 @@ def toggle_alias(alias_id):
if not alias or alias.user_id != user.id:
return jsonify(error="Forbidden"), 403
alias.enabled = not alias.enabled
alias_utils.change_alias_status(alias, enabled=not alias.enabled)
Session.commit()
return jsonify(enabled=alias.enabled), 200

View File

@ -17,9 +17,14 @@ from app.models import PlanEnum, AppleSubscription
_MONTHLY_PRODUCT_ID = "io.simplelogin.ios_app.subscription.premium.monthly"
_YEARLY_PRODUCT_ID = "io.simplelogin.ios_app.subscription.premium.yearly"
# SL Mac app used to be in SL account
_MACAPP_MONTHLY_PRODUCT_ID = "io.simplelogin.macapp.subscription.premium.monthly"
_MACAPP_YEARLY_PRODUCT_ID = "io.simplelogin.macapp.subscription.premium.yearly"
# SL Mac app is moved to Proton account
_MACAPP_MONTHLY_PRODUCT_ID_NEW = "me.proton.simplelogin.macos.premium.monthly"
_MACAPP_YEARLY_PRODUCT_ID_NEW = "me.proton.simplelogin.macos.premium.yearly"
# Apple API URL
_SANDBOX_URL = "https://sandbox.itunes.apple.com/verifyReceipt"
_PROD_URL = "https://buy.itunes.apple.com/verifyReceipt"
@ -263,7 +268,11 @@ def apple_update_notification():
plan = (
PlanEnum.monthly
if transaction["product_id"]
in (_MONTHLY_PRODUCT_ID, _MACAPP_MONTHLY_PRODUCT_ID)
in (
_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID_NEW,
)
else PlanEnum.yearly
)
@ -517,7 +526,11 @@ def verify_receipt(receipt_data, user, password) -> Optional[AppleSubscription]:
plan = (
PlanEnum.monthly
if latest_transaction["product_id"]
in (_MONTHLY_PRODUCT_ID, _MACAPP_MONTHLY_PRODUCT_ID)
in (
_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID_NEW,
)
else PlanEnum.yearly
)

View File

@ -11,7 +11,7 @@ from itsdangerous import Signer
from app import email_utils
from app.api.base import api_bp
from app.config import FLASK_SECRET, DISABLE_REGISTRATION
from app.dashboard.views.setting import send_reset_password_email
from app.dashboard.views.account_setting import send_reset_password_email
from app.db import Session
from app.email_utils import (
email_can_be_used_as_mailbox,

View File

@ -32,6 +32,7 @@ def user_to_dict(user: User) -> dict:
"in_trial": user.in_trial(),
"max_alias_free_plan": user.max_alias_for_free_account(),
"connected_proton_address": None,
"can_create_reverse_alias": user.can_create_contacts(),
}
if config.CONNECT_WITH_PROTON:
@ -58,6 +59,7 @@ def user_info():
- in_trial
- max_alias_free
- is_connected_with_proton
- can_create_reverse_alias
"""
user = g.user

View File

@ -16,6 +16,7 @@ from .views import (
social,
recovery,
api_to_cookie,
oidc,
)
__all__ = [
@ -36,4 +37,5 @@ __all__ = [
"social",
"recovery",
"api_to_cookie",
"oidc",
]

View File

@ -3,10 +3,13 @@ from flask_login import login_user
from app.auth.base import auth_bp
from app.db import Session
from app.extensions import limiter
from app.log import LOG
from app.models import EmailChange, ResetPasswordCode
@auth_bp.route("/change_email", methods=["GET", "POST"])
@limiter.limit("3/hour")
def change_email():
code = request.args.get("code")
@ -22,12 +25,14 @@ def change_email():
return render_template("auth/change_email.html")
user = email_change.user
old_email = user.email
user.email = email_change.new_email
EmailChange.delete(email_change.id)
ResetPasswordCode.filter_by(user_id=user.id).delete()
Session.commit()
LOG.i(f"User {user} has changed their email from {old_email} to {user.email}")
flash("Your new email has been updated", "success")
login_user(user)

View File

@ -3,7 +3,7 @@ from flask_wtf import FlaskForm
from wtforms import StringField, validators
from app.auth.base import auth_bp
from app.dashboard.views.setting import send_reset_password_email
from app.dashboard.views.account_setting import send_reset_password_email
from app.extensions import limiter
from app.log import LOG
from app.models import User

View File

@ -7,7 +7,7 @@ from app.config import URL, GOOGLE_CLIENT_ID, GOOGLE_CLIENT_SECRET
from app.db import Session
from app.log import LOG
from app.models import User, File, SocialAuth
from app.utils import random_string, sanitize_email
from app.utils import random_string, sanitize_email, sanitize_next_url
from .login_utils import after_login
_authorization_base_url = "https://accounts.google.com/o/oauth2/v2/auth"
@ -29,7 +29,7 @@ def google_login():
# to avoid flask-login displaying the login error message
session.pop("_flashes", None)
next_url = request.args.get("next")
next_url = sanitize_next_url(request.args.get("next"))
# Google does not allow to append param to redirect_url
# we need to pass the next url by session

View File

@ -5,7 +5,7 @@ from wtforms import StringField, validators
from app.auth.base import auth_bp
from app.auth.views.login_utils import after_login
from app.config import CONNECT_WITH_PROTON
from app.config import CONNECT_WITH_PROTON, CONNECT_WITH_OIDC_ICON, OIDC_CLIENT_ID
from app.events.auth_event import LoginEvent
from app.extensions import limiter
from app.log import LOG
@ -77,4 +77,6 @@ def login():
next_url=next_url,
show_resend_activation=show_resend_activation,
connect_with_proton=CONNECT_WITH_PROTON,
connect_with_oidc=OIDC_CLIENT_ID is not None,
connect_with_oidc_icon=CONNECT_WITH_OIDC_ICON,
)

app/auth/views/oidc.py (new file, 135 lines)
View File

@ -0,0 +1,135 @@
from flask import request, session, redirect, flash, url_for
from requests_oauthlib import OAuth2Session
import requests
from app import config
from app.auth.base import auth_bp
from app.auth.views.login_utils import after_login
from app.config import (
URL,
OIDC_SCOPES,
OIDC_NAME_FIELD,
)
from app.db import Session
from app.email_utils import send_welcome_email
from app.log import LOG
from app.models import User, SocialAuth
from app.utils import sanitize_email, sanitize_next_url
# need to set explicitly redirect_uri instead of leaving the lib to pre-fill redirect_uri
# when served behind nginx, the redirect_uri is localhost... and not the real url
redirect_uri = URL + "/auth/oidc/callback"
SESSION_STATE_KEY = "oauth_state"
SESSION_NEXT_KEY = "oauth_redirect_next"
@auth_bp.route("/oidc/login")
def oidc_login():
if config.OIDC_CLIENT_ID is None or config.OIDC_CLIENT_SECRET is None:
return redirect(url_for("auth.login"))
next_url = sanitize_next_url(request.args.get("next"))
auth_url = requests.get(config.OIDC_WELL_KNOWN_URL).json()["authorization_endpoint"]
oidc = OAuth2Session(
config.OIDC_CLIENT_ID, scope=[OIDC_SCOPES], redirect_uri=redirect_uri
)
authorization_url, state = oidc.authorization_url(auth_url)
# State is used to prevent CSRF, keep this for later.
session[SESSION_STATE_KEY] = state
session[SESSION_NEXT_KEY] = next_url
return redirect(authorization_url)
@auth_bp.route("/oidc/callback")
def oidc_callback():
if SESSION_STATE_KEY not in session:
flash("Invalid state, please retry", "error")
return redirect(url_for("auth.login"))
if config.OIDC_CLIENT_ID is None or config.OIDC_CLIENT_SECRET is None:
return redirect(url_for("auth.login"))
# user clicks on cancel
if "error" in request.args:
flash("Please use another sign in method then", "warning")
return redirect("/")
oidc_configuration = requests.get(config.OIDC_WELL_KNOWN_URL).json()
user_info_url = oidc_configuration["userinfo_endpoint"]
token_url = oidc_configuration["token_endpoint"]
oidc = OAuth2Session(
config.OIDC_CLIENT_ID,
state=session[SESSION_STATE_KEY],
scope=[OIDC_SCOPES],
redirect_uri=redirect_uri,
)
oidc.fetch_token(
token_url,
client_secret=config.OIDC_CLIENT_SECRET,
authorization_response=request.url,
)
oidc_user_data = oidc.get(user_info_url)
if oidc_user_data.status_code != 200:
LOG.e(
f"cannot get oidc user data {oidc_user_data.status_code} {oidc_user_data.text}"
)
flash(
"Cannot get user data from OIDC, please use another way to login/sign up",
"error",
)
return redirect(url_for("auth.login"))
oidc_user_data = oidc_user_data.json()
email = oidc_user_data.get("email")
if not email:
LOG.e(f"cannot get email for OIDC user {oidc_user_data} {email}")
flash(
"Cannot get a valid email from OIDC, please another way to login/sign up",
"error",
)
return redirect(url_for("auth.login"))
email = sanitize_email(email)
user = User.get_by(email=email)
if not user and config.DISABLE_REGISTRATION:
flash(
"Sorry you cannot sign up via the OIDC provider. Please sign-up first with your email.",
"error",
)
return redirect(url_for("auth.register"))
elif not user:
user = create_user(email, oidc_user_data)
if not SocialAuth.get_by(user_id=user.id, social="oidc"):
SocialAuth.create(user_id=user.id, social="oidc")
Session.commit()
# The activation link contains the original page, for ex authorize page
next_url = session[SESSION_NEXT_KEY]
session[SESSION_NEXT_KEY] = None
return after_login(user, next_url)
def create_user(email, oidc_user_data):
new_user = User.create(
email=email,
name=oidc_user_data.get(OIDC_NAME_FIELD),
password="",
activated=True,
)
LOG.i(f"Created new user for login request from OIDC. New user {new_user.id}")
Session.commit()
send_welcome_email(new_user)
return new_user
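
For context, the view above only relies on three fields of the provider's OIDC discovery document fetched from OIDC_WELL_KNOWN_URL. A minimal sketch of that lookup, assuming a hypothetical well-known URL (the helper name is illustrative, not part of this file):

```python
import requests


def fetch_oidc_endpoints(well_known_url: str) -> dict:
    # e.g. https://provider.example/.well-known/openid-configuration (hypothetical)
    cfg = requests.get(well_known_url, timeout=10).json()
    return {
        "authorization_endpoint": cfg["authorization_endpoint"],
        "token_endpoint": cfg["token_endpoint"],
        "userinfo_endpoint": cfg["userinfo_endpoint"],
    }
```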

View File

@ -6,7 +6,7 @@ from wtforms import StringField, validators
from app import email_utils, config
from app.auth.base import auth_bp
from app.config import CONNECT_WITH_PROTON
from app.config import CONNECT_WITH_PROTON, CONNECT_WITH_OIDC_ICON
from app.auth.views.login_utils import get_referral
from app.config import URL, HCAPTCHA_SECRET, HCAPTCHA_SITEKEY
from app.db import Session
@ -109,6 +109,8 @@ def register():
next_url=next_url,
HCAPTCHA_SITEKEY=HCAPTCHA_SITEKEY,
connect_with_proton=CONNECT_WITH_PROTON,
connect_with_oidc=config.OIDC_CLIENT_ID is not None,
connect_with_oidc_icon=CONNECT_WITH_OIDC_ICON,
)

View File

@ -234,7 +234,7 @@ else:
print("WARNING: Use a temp directory for GNUPGHOME", GNUPGHOME)
# Github, Google, Facebook client id and secrets
# Github, Google, Facebook, OIDC client id and secrets
GITHUB_CLIENT_ID = os.environ.get("GITHUB_CLIENT_ID")
GITHUB_CLIENT_SECRET = os.environ.get("GITHUB_CLIENT_SECRET")
@ -244,6 +244,13 @@ GOOGLE_CLIENT_SECRET = os.environ.get("GOOGLE_CLIENT_SECRET")
FACEBOOK_CLIENT_ID = os.environ.get("FACEBOOK_CLIENT_ID")
FACEBOOK_CLIENT_SECRET = os.environ.get("FACEBOOK_CLIENT_SECRET")
CONNECT_WITH_OIDC_ICON = os.environ.get("CONNECT_WITH_OIDC_ICON")
OIDC_WELL_KNOWN_URL = os.environ.get("OIDC_WELL_KNOWN_URL")
OIDC_CLIENT_ID = os.environ.get("OIDC_CLIENT_ID")
OIDC_CLIENT_SECRET = os.environ.get("OIDC_CLIENT_SECRET")
OIDC_SCOPES = os.environ.get("OIDC_SCOPES")
OIDC_NAME_FIELD = os.environ.get("OIDC_NAME_FIELD", "name")
PROTON_CLIENT_ID = os.environ.get("PROTON_CLIENT_ID")
PROTON_CLIENT_SECRET = os.environ.get("PROTON_CLIENT_SECRET")
PROTON_BASE_URL = os.environ.get(
@ -421,6 +428,11 @@ try:
except Exception:
HIBP_SCAN_INTERVAL_DAYS = 7
HIBP_API_KEYS = sl_getenv("HIBP_API_KEYS", list) or []
HIBP_MAX_ALIAS_CHECK = 10_000
HIBP_RPM = int(os.environ.get("HIBP_API_RPM", 100))
HIBP_SKIP_PARTNER_ALIAS = os.environ.get("HIBP_SKIP_PARTNER_ALIAS")
KEEP_OLD_DATA_DAYS = 30
POSTMASTER = os.environ.get("POSTMASTER")
@ -489,7 +501,34 @@ def setup_nameservers():
NAMESERVERS = setup_nameservers()
DISABLE_CREATE_CONTACTS_FOR_FREE_USERS = False
DISABLE_CREATE_CONTACTS_FOR_FREE_USERS = os.environ.get(
"DISABLE_CREATE_CONTACTS_FOR_FREE_USERS", False
)
# Expect format hits,seconds:hits,seconds...
# Example 1,10:4,60 means 1 in the last 10 secs or 4 in the last 60 secs
def getRateLimitFromConfig(
env_var: string, default: string = ""
) -> list[tuple[int, int]]:
value = os.environ.get(env_var, default)
if not value:
return []
entries = [entry for entry in value.split(":")]
limits = []
for entry in entries:
fields = entry.split(",")
limit = (int(fields[0]), int(fields[1]))
limits.append(limit)
return limits
ALIAS_CREATE_RATE_LIMIT_FREE = getRateLimitFromConfig(
"ALIAS_CREATE_RATE_LIMIT_FREE", "10,900:50,3600"
)
ALIAS_CREATE_RATE_LIMIT_PAID = getRateLimitFromConfig(
"ALIAS_CREATE_RATE_LIMIT_PAID", "50,900:200,3600"
)
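
Worked example of the `hits,seconds:hits,seconds` format documented above, as a standalone re-implementation for illustration (the real parser is `getRateLimitFromConfig` just above):

```python
def parse_rate_limits(value: str) -> list[tuple[int, int]]:
    if not value:
        return []
    limits = []
    for entry in value.split(":"):
        hits, seconds = entry.split(",")
        limits.append((int(hits), int(seconds)))
    return limits


# "1,10:4,60" -> [(1, 10), (4, 60)]: at most 1 in the last 10 seconds
# or 4 in the last 60 seconds.
assert parse_rate_limits("1,10:4,60") == [(1, 10), (4, 60)]
```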
PARTNER_API_TOKEN_SECRET = os.environ.get("PARTNER_API_TOKEN_SECRET") or (
FLASK_SECRET + "partnerapitoken"
)
@ -540,3 +579,11 @@ MAX_API_KEYS = int(os.environ.get("MAX_API_KEYS", 30))
UPCLOUD_USERNAME = os.environ.get("UPCLOUD_USERNAME", None)
UPCLOUD_PASSWORD = os.environ.get("UPCLOUD_PASSWORD", None)
UPCLOUD_DB_ID = os.environ.get("UPCLOUD_DB_ID", None)
STORE_TRANSACTIONAL_EMAILS = "STORE_TRANSACTIONAL_EMAILS" in os.environ
EVENT_WEBHOOK = os.environ.get("EVENT_WEBHOOK", None)
# We want it disabled by default, so only skip if defined
EVENT_WEBHOOK_SKIP_VERIFY_SSL = "EVENT_WEBHOOK_SKIP_VERIFY_SSL" in os.environ
EVENT_WEBHOOK_DISABLE = "EVENT_WEBHOOK_DISABLE" in os.environ
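
The three settings above are consumed by the HTTP event sink added in #2116, which is not shown in this compare. A rough sketch only, under the assumption that the sink POSTs the serialized event to `EVENT_WEBHOOK` and honours the two flags; the function name and shape are illustrative, not the project's actual `events/event_sink.py`:

```python
import requests

from app import config


def send_to_webhook(payload: bytes) -> bool:
    # Webhook disabled explicitly, or not configured at all: nothing to do.
    if config.EVENT_WEBHOOK_DISABLE or not config.EVENT_WEBHOOK:
        return False
    resp = requests.post(
        config.EVENT_WEBHOOK,
        data=payload,
        verify=not config.EVENT_WEBHOOK_SKIP_VERIFY_SSL,
        timeout=10,
    )
    return resp.ok
```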

View File

@ -32,6 +32,7 @@ from .views import (
delete_account,
notification,
support,
account_setting,
)
__all__ = [
@ -68,4 +69,5 @@ __all__ = [
"delete_account",
"notification",
"support",
"account_setting",
]

View File

@ -0,0 +1,242 @@
import arrow
from flask import (
render_template,
request,
redirect,
url_for,
flash,
)
from flask_login import login_required, current_user
from app import email_utils
from app.config import (
URL,
FIRST_ALIAS_DOMAIN,
ALIAS_RANDOM_SUFFIX_LENGTH,
CONNECT_WITH_PROTON,
)
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.dashboard.views.mailbox_detail import ChangeEmailForm
from app.db import Session
from app.email_utils import (
email_can_be_used_as_mailbox,
personal_email_already_used,
)
from app.extensions import limiter
from app.jobs.export_user_data_job import ExportUserDataJob
from app.log import LOG
from app.models import (
BlockBehaviourEnum,
PlanEnum,
ResetPasswordCode,
EmailChange,
User,
Alias,
AliasGeneratorEnum,
SenderFormatEnum,
UnsubscribeBehaviourEnum,
)
from app.proton.utils import perform_proton_account_unlink
from app.utils import (
random_string,
CSRFValidationForm,
canonicalize_email,
)
@dashboard_bp.route("/account_setting", methods=["GET", "POST"])
@login_required
@sudo_required
@limiter.limit("5/minute", methods=["POST"])
def account_setting():
change_email_form = ChangeEmailForm()
csrf_form = CSRFValidationForm()
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
pending_email = email_change.new_email
else:
pending_email = None
if request.method == "POST":
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
if request.form.get("form-name") == "update-email":
if change_email_form.validate():
# whether user can proceed with the email update
new_email_valid = True
new_email = canonicalize_email(change_email_form.email.data)
if new_email != current_user.email and not pending_email:
# check if this email is not already used
if personal_email_already_used(new_email) or Alias.get_by(
email=new_email
):
flash(f"Email {new_email} already used", "error")
new_email_valid = False
elif not email_can_be_used_as_mailbox(new_email):
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
# a pending email change with the same email exists from another user
elif EmailChange.get_by(new_email=new_email):
other_email_change: EmailChange = EmailChange.get_by(
new_email=new_email
)
LOG.w(
"Another user has a pending %s with the same email address. Current user:%s",
other_email_change,
current_user,
)
if other_email_change.is_expired():
LOG.d(
"delete the expired email change %s", other_email_change
)
EmailChange.delete(other_email_change.id)
Session.commit()
else:
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
if new_email_valid:
email_change = EmailChange.create(
user_id=current_user.id,
code=random_string(
60
), # todo: make sure the code is unique
new_email=new_email,
)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash(
"A confirmation email is on the way, please check your inbox",
"success",
)
return redirect(url_for("dashboard.account_setting"))
elif request.form.get("form-name") == "change-password":
flash(
"You are going to receive an email containing instructions to change your password",
"success",
)
send_reset_password_email(current_user)
return redirect(url_for("dashboard.account_setting"))
elif request.form.get("form-name") == "send-full-user-report":
if ExportUserDataJob(current_user).store_job_in_db():
flash(
"You will receive your SimpleLogin data via email shortly",
"success",
)
else:
flash("An export of your data is currently in progress", "error")
partner_sub = None
partner_name = None
return render_template(
"dashboard/account_setting.html",
csrf_form=csrf_form,
PlanEnum=PlanEnum,
SenderFormatEnum=SenderFormatEnum,
BlockBehaviourEnum=BlockBehaviourEnum,
change_email_form=change_email_form,
pending_email=pending_email,
AliasGeneratorEnum=AliasGeneratorEnum,
UnsubscribeBehaviourEnum=UnsubscribeBehaviourEnum,
partner_sub=partner_sub,
partner_name=partner_name,
FIRST_ALIAS_DOMAIN=FIRST_ALIAS_DOMAIN,
ALIAS_RAND_SUFFIX_LENGTH=ALIAS_RANDOM_SUFFIX_LENGTH,
connect_with_proton=CONNECT_WITH_PROTON,
)
def send_reset_password_email(user):
"""
generate a new ResetPasswordCode and send it over email to user
"""
# the activation code is valid for 1h
reset_password_code = ResetPasswordCode.create(
user_id=user.id, code=random_string(60)
)
Session.commit()
reset_password_link = f"{URL}/auth/reset_password?code={reset_password_code.code}"
email_utils.send_reset_password_email(user.email, reset_password_link)
def send_change_email_confirmation(user: User, email_change: EmailChange):
"""
send confirmation email to the new email address
"""
link = f"{URL}/auth/change_email?code={email_change.code}"
email_utils.send_change_email(email_change.new_email, user.email, link)
@dashboard_bp.route("/resend_email_change", methods=["GET", "POST"])
@limiter.limit("5/hour")
@login_required
@sudo_required
def resend_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
# extend email change expiration
email_change.expired = arrow.now().shift(hours=12)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash("A confirmation email is on the way, please check your inbox", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/cancel_email_change", methods=["GET", "POST"])
@login_required
@sudo_required
def cancel_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
EmailChange.delete(email_change.id)
Session.commit()
flash("Your email change is cancelled", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/unlink_proton_account", methods=["POST"])
@login_required
@sudo_required
def unlink_proton_account():
csrf_form = CSRFValidationForm()
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
perform_proton_account_unlink(current_user)
flash("Your Proton account has been unlinked", "success")
return redirect(url_for("dashboard.setting"))

View File

@ -51,14 +51,6 @@ def email_validator():
return _check
def user_can_create_contacts(user: User) -> bool:
if user.is_premium():
return True
if user.flags & User.FLAG_FREE_DISABLE_CREATE_ALIAS == 0:
return True
return not config.DISABLE_CREATE_CONTACTS_FOR_FREE_USERS
def create_contact(user: User, alias: Alias, contact_address: str) -> Contact:
"""
Create a contact for a user. Can be restricted for new free users by enabling DISABLE_CREATE_CONTACTS_FOR_FREE_USERS.
@ -82,7 +74,7 @@ def create_contact(user: User, alias: Alias, contact_address: str) -> Contact:
if contact:
raise ErrContactAlreadyExists(contact)
if not user_can_create_contacts(user):
if not user.can_create_contacts():
raise ErrContactErrorUpgradeNeeded()
contact = Contact.create(
@ -327,6 +319,6 @@ def alias_contact_manager(alias_id):
last_page=last_page,
query=query,
nb_contact=nb_contact,
can_create_contacts=user_can_create_contacts(current_user),
can_create_contacts=current_user.can_create_contacts(),
csrf_form=csrf_form,
)
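
The diff above swaps the module-level `user_can_create_contacts(user)` helper for a `User.can_create_contacts()` method. Assuming the method keeps the removed helper's logic (visible at the top of this diff), it amounts to the following sketch:

```python
from app import config
from app.models import User


def can_create_contacts(user: User) -> bool:
    # Premium users always can; free users can unless the restriction flag is set
    # and DISABLE_CREATE_CONTACTS_FOR_FREE_USERS is enabled.
    if user.is_premium():
        return True
    if user.flags & User.FLAG_FREE_DISABLE_CREATE_ALIAS == 0:
        return True
    return not config.DISABLE_CREATE_CONTACTS_FOR_FREE_USERS
```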

View File

@ -1,9 +1,13 @@
from app.dashboard.base import dashboard_bp
from flask_login import login_required, current_user
from app.alias_utils import alias_export_csv
from app.dashboard.views.enter_sudo import sudo_required
from app.extensions import limiter
@dashboard_bp.route("/alias_export", methods=["GET"])
@login_required
@sudo_required
@limiter.limit("2/minute")
def alias_export_route():
return alias_export_csv(current_user)

View File

@ -5,7 +5,9 @@ from flask_login import login_required, current_user
from app import s3
from app.config import JOB_BATCH_IMPORT
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.db import Session
from app.extensions import limiter
from app.log import LOG
from app.models import File, BatchImport, Job
from app.utils import random_string, CSRFValidationForm
@ -13,6 +15,8 @@ from app.utils import random_string, CSRFValidationForm
@dashboard_bp.route("/batch_import", methods=["GET", "POST"])
@login_required
@sudo_required
@limiter.limit("10/minute", methods=["POST"])
def batch_import_route():
# only for users who have custom domains
if not current_user.verified_custom_domains():
@ -37,7 +41,7 @@ def batch_import_route():
return redirect(request.url)
if len(batch_imports) > 10:
flash(
"You have too many imports already. Wait until some get cleaned up",
"You have too many imports already. Please wait until some get cleaned up",
"error",
)
return render_template(

View File

@ -6,15 +6,15 @@ from flask_login import login_required, current_user
from flask_wtf import FlaskForm
from wtforms import PasswordField, validators
from app.config import CONNECT_WITH_PROTON
from app.config import CONNECT_WITH_PROTON, OIDC_CLIENT_ID, CONNECT_WITH_OIDC_ICON
from app.dashboard.base import dashboard_bp
from app.extensions import limiter
from app.log import LOG
from app.models import PartnerUser
from app.models import PartnerUser, SocialAuth
from app.proton.utils import get_proton_partner
from app.utils import sanitize_next_url
_SUDO_GAP = 900
_SUDO_GAP = 120
class LoginForm(FlaskForm):
@ -51,11 +51,19 @@ def enter_sudo():
if not partner_user or partner_user.partner_id != get_proton_partner().id:
proton_enabled = False
oidc_enabled = OIDC_CLIENT_ID is not None
if oidc_enabled:
oidc_enabled = (
SocialAuth.get_by(user_id=current_user.id, social="oidc") is not None
)
return render_template(
"dashboard/enter_sudo.html",
password_check_form=password_check_form,
next=request.args.get("next"),
connect_with_proton=proton_enabled,
connect_with_oidc=oidc_enabled,
connect_with_oidc_icon=CONNECT_WITH_OIDC_ICON,
)

View File

@ -52,13 +52,13 @@ def get_stats(user: User) -> Stats:
@dashboard_bp.route("/", methods=["GET", "POST"])
@login_required
@limiter.limit(
ALIAS_LIMIT,
methods=["POST"],
exempt_when=lambda: request.form.get("form-name") != "create-random-email",
)
@limiter.limit("10/minute", methods=["GET"], key_func=lambda: current_user.id)
@login_required
@parallel_limiter.lock(
name="alias_creation",
only_when=lambda: request.form.get("form-name") == "create-random-email",
@ -141,12 +141,12 @@ def index():
)
if request.form.get("form-name") == "delete-alias":
LOG.d("delete alias %s", alias)
LOG.i(f"User {current_user} requested deletion of alias {alias}")
email = alias.email
alias_utils.delete_alias(alias, current_user)
flash(f"Alias {email} has been deleted", "success")
elif request.form.get("form-name") == "disable-alias":
alias.enabled = False
alias_utils.change_alias_status(alias, enabled=False)
Session.commit()
flash(f"Alias {alias.email} has been disabled", "success")

View File

@ -11,9 +11,11 @@ from wtforms.fields.html5 import EmailField
from app.config import ENFORCE_SPF, MAILBOX_SECRET
from app.config import URL
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.db import Session
from app.email_utils import email_can_be_used_as_mailbox
from app.email_utils import mailbox_already_used, render, send_email
from app.extensions import limiter
from app.log import LOG
from app.models import Alias, AuthorizedAddress
from app.models import Mailbox
@ -29,6 +31,8 @@ class ChangeEmailForm(FlaskForm):
@dashboard_bp.route("/mailbox/<int:mailbox_id>/", methods=["GET", "POST"])
@login_required
@sudo_required
@limiter.limit("20/minute", methods=["POST"])
def mailbox_detail_route(mailbox_id):
mailbox: Mailbox = Mailbox.get(mailbox_id)
if not mailbox or mailbox.user_id != current_user.id:
@ -179,8 +183,15 @@ def mailbox_detail_route(mailbox_id):
elif request.form.get("form-name") == "toggle-pgp":
if request.form.get("pgp-enabled") == "on":
mailbox.disable_pgp = False
flash(f"PGP is enabled on {mailbox.email}", "success")
if mailbox.is_proton():
mailbox.disable_pgp = True
flash(
"Enabling PGP for a Proton Mail mailbox is redundant and does not add any security benefit",
"info",
)
else:
mailbox.disable_pgp = False
flash(f"PGP is enabled on {mailbox.email}", "info")
else:
mailbox.disable_pgp = True
flash(f"PGP is disabled on {mailbox.email}", "info")

View File

@ -13,34 +13,24 @@ from flask_login import login_required, current_user
from flask_wtf import FlaskForm
from flask_wtf.file import FileField
from wtforms import StringField, validators
from wtforms.fields.html5 import EmailField
from app import s3, email_utils
from app import s3
from app.config import (
URL,
FIRST_ALIAS_DOMAIN,
ALIAS_RANDOM_SUFFIX_LENGTH,
CONNECT_WITH_PROTON,
)
from app.dashboard.base import dashboard_bp
from app.db import Session
from app.email_utils import (
email_can_be_used_as_mailbox,
personal_email_already_used,
)
from app.errors import ProtonPartnerNotSetUp
from app.extensions import limiter
from app.image_validation import detect_image_format, ImageFormat
from app.jobs.export_user_data_job import ExportUserDataJob
from app.log import LOG
from app.models import (
BlockBehaviourEnum,
PlanEnum,
File,
ResetPasswordCode,
EmailChange,
User,
Alias,
CustomDomain,
AliasGeneratorEnum,
AliasSuffixEnum,
@ -53,11 +43,10 @@ from app.models import (
PartnerSubscription,
UnsubscribeBehaviourEnum,
)
from app.proton.utils import get_proton_partner, perform_proton_account_unlink
from app.proton.utils import get_proton_partner
from app.utils import (
random_string,
CSRFValidationForm,
canonicalize_email,
)
@ -66,12 +55,6 @@ class SettingForm(FlaskForm):
profile_picture = FileField("Profile Picture")
class ChangeEmailForm(FlaskForm):
email = EmailField(
"email", validators=[validators.DataRequired(), validators.Email()]
)
class PromoCodeForm(FlaskForm):
code = StringField("Name", validators=[validators.DataRequired()])
@ -109,7 +92,6 @@ def get_partner_subscription_and_name(
def setting():
form = SettingForm()
promo_form = PromoCodeForm()
change_email_form = ChangeEmailForm()
csrf_form = CSRFValidationForm()
email_change = EmailChange.get_by(user_id=current_user.id)
@ -122,63 +104,7 @@ def setting():
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
if request.form.get("form-name") == "update-email":
if change_email_form.validate():
# whether user can proceed with the email update
new_email_valid = True
new_email = canonicalize_email(change_email_form.email.data)
if new_email != current_user.email and not pending_email:
# check if this email is not already used
if personal_email_already_used(new_email) or Alias.get_by(
email=new_email
):
flash(f"Email {new_email} already used", "error")
new_email_valid = False
elif not email_can_be_used_as_mailbox(new_email):
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
# a pending email change with the same email exists from another user
elif EmailChange.get_by(new_email=new_email):
other_email_change: EmailChange = EmailChange.get_by(
new_email=new_email
)
LOG.w(
"Another user has a pending %s with the same email address. Current user:%s",
other_email_change,
current_user,
)
if other_email_change.is_expired():
LOG.d(
"delete the expired email change %s", other_email_change
)
EmailChange.delete(other_email_change.id)
Session.commit()
else:
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
if new_email_valid:
email_change = EmailChange.create(
user_id=current_user.id,
code=random_string(
60
), # todo: make sure the code is unique
new_email=new_email,
)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash(
"A confirmation email is on the way, please check your inbox",
"success",
)
return redirect(url_for("dashboard.setting"))
if request.form.get("form-name") == "update-profile":
if form.validate():
profile_updated = False
@ -222,15 +148,6 @@ def setting():
if profile_updated:
flash("Your profile has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-password":
flash(
"You are going to receive an email containing instructions to change your password",
"success",
)
send_reset_password_email(current_user)
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "notification-preference":
choose = request.form.get("notification")
if choose == "on":
@ -240,7 +157,6 @@ def setting():
Session.commit()
flash("Your notification preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-alias-generator":
scheme = int(request.form.get("alias-generator-scheme"))
if AliasGeneratorEnum.has_value(scheme):
@ -248,7 +164,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-random-alias-default-domain":
default_domain = request.form.get("random-alias-default-domain")
@ -287,7 +202,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "random-alias-suffix":
scheme = int(request.form.get("random-alias-suffix-generator"))
if AliasSuffixEnum.has_value(scheme):
@ -295,7 +209,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-sender-format":
sender_format = int(request.form.get("sender-format"))
if SenderFormatEnum.has_value(sender_format):
@ -305,7 +218,6 @@ def setting():
flash("Your sender format preference has been updated", "success")
Session.commit()
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "replace-ra":
choose = request.form.get("replace-ra")
if choose == "on":
@ -315,7 +227,21 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "enable_data_breach_check":
if not current_user.is_premium():
flash("Only premium plan can enable data breach monitoring", "warning")
return redirect(url_for("dashboard.setting"))
choose = request.form.get("enable_data_breach_check")
if choose == "on":
LOG.i("User {current_user} has enabled data breach monitoring")
current_user.enable_data_breach_check = True
flash("Data breach monitoring is enabled", "success")
else:
LOG.i("User {current_user} has disabled data breach monitoring")
current_user.enable_data_breach_check = False
flash("Data breach monitoring is disabled", "info")
Session.commit()
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "sender-in-ra":
choose = request.form.get("enable")
if choose == "on":
@ -325,7 +251,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "expand-alias-info":
choose = request.form.get("enable")
if choose == "on":
@ -387,14 +312,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "send-full-user-report":
if ExportUserDataJob(current_user).store_job_in_db():
flash(
"You will receive your SimpleLogin data via email shortly",
"success",
)
else:
flash("An export of your data is currently in progress", "error")
manual_sub = ManualSubscription.get_by(user_id=current_user.id)
apple_sub = AppleSubscription.get_by(user_id=current_user.id)
@ -417,7 +334,6 @@ def setting():
SenderFormatEnum=SenderFormatEnum,
BlockBehaviourEnum=BlockBehaviourEnum,
promo_form=promo_form,
change_email_form=change_email_form,
pending_email=pending_email,
AliasGeneratorEnum=AliasGeneratorEnum,
UnsubscribeBehaviourEnum=UnsubscribeBehaviourEnum,
@ -432,85 +348,3 @@ def setting():
connect_with_proton=CONNECT_WITH_PROTON,
proton_linked_account=proton_linked_account,
)
def send_reset_password_email(user):
"""
generate a new ResetPasswordCode and send it over email to user
"""
# the activation code is valid for 1h
reset_password_code = ResetPasswordCode.create(
user_id=user.id, code=random_string(60)
)
Session.commit()
reset_password_link = f"{URL}/auth/reset_password?code={reset_password_code.code}"
email_utils.send_reset_password_email(user.email, reset_password_link)
def send_change_email_confirmation(user: User, email_change: EmailChange):
"""
send confirmation email to the new email address
"""
link = f"{URL}/auth/change_email?code={email_change.code}"
email_utils.send_change_email(email_change.new_email, user.email, link)
@dashboard_bp.route("/resend_email_change", methods=["GET", "POST"])
@limiter.limit("5/hour")
@login_required
def resend_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
# extend email change expiration
email_change.expired = arrow.now().shift(hours=12)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash("A confirmation email is on the way, please check your inbox", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/cancel_email_change", methods=["GET", "POST"])
@login_required
def cancel_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
EmailChange.delete(email_change.id)
Session.commit()
flash("Your email change is cancelled", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/unlink_proton_account", methods=["POST"])
@login_required
def unlink_proton_account():
csrf_form = CSRFValidationForm()
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
perform_proton_account_unlink(current_user)
flash("Your Proton account has been unlinked", "success")
return redirect(url_for("dashboard.setting"))

View File

@ -8,6 +8,7 @@ from app.db import Session
from flask import redirect, url_for, flash, request, render_template
from flask_login import login_required, current_user
from app import alias_utils
from app.dashboard.base import dashboard_bp
from app.handler.unsubscribe_encoder import UnsubscribeAction
from app.handler.unsubscribe_handler import UnsubscribeHandler
@ -31,7 +32,7 @@ def unsubscribe(alias_id):
# automatic unsubscribe, according to https://tools.ietf.org/html/rfc8058
if request.method == "POST":
alias.enabled = False
alias_utils.change_alias_status(alias, False)
flash(f"Alias {alias.email} has been blocked", "success")
Session.commit()

View File

@ -21,6 +21,7 @@ LIST_UNSUBSCRIBE = "List-Unsubscribe"
LIST_UNSUBSCRIBE_POST = "List-Unsubscribe-Post"
RETURN_PATH = "Return-Path"
AUTHENTICATION_RESULTS = "Authentication-Results"
SL_QUEUE_ID = "X-SL-Queue-Id"
# headers used to DKIM sign in order of preference
DKIM_HEADERS = [

View File

@ -494,9 +494,10 @@ def delete_header(msg: Message, header: str):
def sanitize_header(msg: Message, header: str):
"""remove trailing space and remove linebreak from a header"""
header_lowercase = header.lower()
for i in reversed(range(len(msg._headers))):
header_name = msg._headers[i][0].lower()
if header_name == header.lower():
if header_name == header_lowercase:
# msg._headers[i] is a tuple like ('From', 'hey@google.com')
if msg._headers[i][1]:
msg._headers[i] = (
@ -583,6 +584,26 @@ def email_can_be_used_as_mailbox(email_address: str) -> bool:
LOG.d("MX Domain %s %s is invalid mailbox domain", mx_domain, domain)
return False
existing_user = User.get_by(email=email_address)
if existing_user and existing_user.disabled:
LOG.d(
f"User {existing_user} is disabled. {email_address} cannot be used for other mailbox"
)
return False
for existing_user in (
User.query()
.join(Mailbox, User.id == Mailbox.user_id)
.filter(Mailbox.email == email_address)
.group_by(User.id)
.all()
):
if existing_user.disabled:
LOG.d(
f"User {existing_user} is disabled and has a mailbox with {email_address}. Id cannot be used for other mailbox"
)
return False
return True
@ -1383,7 +1404,7 @@ def generate_verp_email(
# Time is in minutes granularity and start counting on 2022-01-01 to reduce bytes to represent time
data = [
verp_type.value,
object_id,
object_id or 0,
int((time.time() - VERP_TIME_START) / 60),
]
json_payload = json.dumps(data).encode("utf-8")

0
app/events/__init__.py Normal file
View File

View File

@ -0,0 +1,66 @@
from abc import ABC, abstractmethod
from app import config
from app.db import Session
from app.errors import ProtonPartnerNotSetUp
from app.events.generated import event_pb2
from app.models import User, PartnerUser, SyncEvent
from app.proton.utils import get_proton_partner
from typing import Optional
NOTIFICATION_CHANNEL = "simplelogin_sync_events"
class Dispatcher(ABC):
@abstractmethod
def send(self, event: bytes):
pass
class PostgresDispatcher(Dispatcher):
def send(self, event: bytes):
instance = SyncEvent.create(content=event, flush=True)
Session.execute(f"NOTIFY {NOTIFICATION_CHANNEL}, '{instance.id}';")
@staticmethod
def get():
return PostgresDispatcher()
class EventDispatcher:
@staticmethod
def send_event(
user: User,
content: event_pb2.EventContent,
dispatcher: Dispatcher = PostgresDispatcher.get(),
skip_if_webhook_missing: bool = True,
):
if config.EVENT_WEBHOOK_DISABLE:
return
if not config.EVENT_WEBHOOK and skip_if_webhook_missing:
return
partner_user = EventDispatcher.__partner_user(user.id)
if not partner_user:
return
event = event_pb2.Event(
user_id=user.id,
external_user_id=partner_user.external_user_id,
partner_id=partner_user.partner_id,
content=content,
)
serialized = event.SerializeToString()
dispatcher.send(serialized)
@staticmethod
def __partner_user(user_id: int) -> Optional[PartnerUser]:
# Check if the current user has a partner_id
try:
proton_partner_id = get_proton_partner().id
except ProtonPartnerNotSetUp:
return None
# It has. Retrieve the information for the PartnerUser
return PartnerUser.get_by(user_id=user_id, partner_id=proton_partner_id)
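A rough, illustrative sketch (not part of this diff) of how callers are expected to use the dispatcher above: build a protobuf EventContent and hand it to EventDispatcher.send_event, which by default persists the serialized payload through PostgresDispatcher and NOTIFYs the listener channel. The emit_alias_deleted helper and its arguments are made up for illustration.

from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import AliasDeleted, EventContent

def emit_alias_deleted(user, alias_id: int, alias_email: str):
    # Illustrative helper; `user` is assumed to be an app.models.User instance.
    event = AliasDeleted(alias_id=alias_id, alias_email=alias_email)
    # The default PostgresDispatcher stores the serialized event in sync_event
    # and sends NOTIFY simplelogin_sync_events with the new row id.
    EventDispatcher.send_event(user, EventContent(alias_deleted=event))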

View File

@ -0,0 +1,38 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: event.proto
# Protobuf Python Version: 5.26.1
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x0b\x65vent.proto\x12\x12simplelogin_events\"\'\n\x0eUserPlanChange\x12\x15\n\rplan_end_time\x18\x01 \x01(\r\"\r\n\x0bUserDeleted\"Z\n\x0c\x41liasCreated\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\x12\x12\n\nalias_note\x18\x03 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x04 \x01(\x08\"K\n\x11\x41liasStatusChange\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x03 \x01(\x08\"5\n\x0c\x41liasDeleted\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\"\xce\x02\n\x0c\x45ventContent\x12>\n\x10user_plan_change\x18\x01 \x01(\x0b\x32\".simplelogin_events.UserPlanChangeH\x00\x12\x37\n\x0cuser_deleted\x18\x02 \x01(\x0b\x32\x1f.simplelogin_events.UserDeletedH\x00\x12\x39\n\ralias_created\x18\x03 \x01(\x0b\x32 .simplelogin_events.AliasCreatedH\x00\x12\x44\n\x13\x61lias_status_change\x18\x04 \x01(\x0b\x32%.simplelogin_events.AliasStatusChangeH\x00\x12\x39\n\ralias_deleted\x18\x05 \x01(\x0b\x32 .simplelogin_events.AliasDeletedH\x00\x42\t\n\x07\x63ontent\"y\n\x05\x45vent\x12\x0f\n\x07user_id\x18\x01 \x01(\r\x12\x18\n\x10\x65xternal_user_id\x18\x02 \x01(\t\x12\x12\n\npartner_id\x18\x03 \x01(\r\x12\x31\n\x07\x63ontent\x18\x04 \x01(\x0b\x32 .simplelogin_events.EventContentb\x06proto3')
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'event_pb2', _globals)
if not _descriptor._USE_C_DESCRIPTORS:
DESCRIPTOR._loaded_options = None
_globals['_USERPLANCHANGE']._serialized_start=35
_globals['_USERPLANCHANGE']._serialized_end=74
_globals['_USERDELETED']._serialized_start=76
_globals['_USERDELETED']._serialized_end=89
_globals['_ALIASCREATED']._serialized_start=91
_globals['_ALIASCREATED']._serialized_end=181
_globals['_ALIASSTATUSCHANGE']._serialized_start=183
_globals['_ALIASSTATUSCHANGE']._serialized_end=258
_globals['_ALIASDELETED']._serialized_start=260
_globals['_ALIASDELETED']._serialized_end=313
_globals['_EVENTCONTENT']._serialized_start=316
_globals['_EVENTCONTENT']._serialized_end=650
_globals['_EVENT']._serialized_start=652
_globals['_EVENT']._serialized_end=773
# @@protoc_insertion_point(module_scope)

View File

@ -0,0 +1,71 @@
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from typing import ClassVar as _ClassVar, Mapping as _Mapping, Optional as _Optional, Union as _Union
DESCRIPTOR: _descriptor.FileDescriptor
class UserPlanChange(_message.Message):
__slots__ = ("plan_end_time",)
PLAN_END_TIME_FIELD_NUMBER: _ClassVar[int]
plan_end_time: int
def __init__(self, plan_end_time: _Optional[int] = ...) -> None: ...
class UserDeleted(_message.Message):
__slots__ = ()
def __init__(self) -> None: ...
class AliasCreated(_message.Message):
__slots__ = ("alias_id", "alias_email", "alias_note", "enabled")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int]
ALIAS_NOTE_FIELD_NUMBER: _ClassVar[int]
ENABLED_FIELD_NUMBER: _ClassVar[int]
alias_id: int
alias_email: str
alias_note: str
enabled: bool
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ..., alias_note: _Optional[str] = ..., enabled: bool = ...) -> None: ...
class AliasStatusChange(_message.Message):
__slots__ = ("alias_id", "alias_email", "enabled")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int]
ENABLED_FIELD_NUMBER: _ClassVar[int]
alias_id: int
alias_email: str
enabled: bool
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ..., enabled: bool = ...) -> None: ...
class AliasDeleted(_message.Message):
__slots__ = ("alias_id", "alias_email")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int]
alias_id: int
alias_email: str
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ...) -> None: ...
class EventContent(_message.Message):
__slots__ = ("user_plan_change", "user_deleted", "alias_created", "alias_status_change", "alias_deleted")
USER_PLAN_CHANGE_FIELD_NUMBER: _ClassVar[int]
USER_DELETED_FIELD_NUMBER: _ClassVar[int]
ALIAS_CREATED_FIELD_NUMBER: _ClassVar[int]
ALIAS_STATUS_CHANGE_FIELD_NUMBER: _ClassVar[int]
ALIAS_DELETED_FIELD_NUMBER: _ClassVar[int]
user_plan_change: UserPlanChange
user_deleted: UserDeleted
alias_created: AliasCreated
alias_status_change: AliasStatusChange
alias_deleted: AliasDeleted
def __init__(self, user_plan_change: _Optional[_Union[UserPlanChange, _Mapping]] = ..., user_deleted: _Optional[_Union[UserDeleted, _Mapping]] = ..., alias_created: _Optional[_Union[AliasCreated, _Mapping]] = ..., alias_status_change: _Optional[_Union[AliasStatusChange, _Mapping]] = ..., alias_deleted: _Optional[_Union[AliasDeleted, _Mapping]] = ...) -> None: ...
class Event(_message.Message):
__slots__ = ("user_id", "external_user_id", "partner_id", "content")
USER_ID_FIELD_NUMBER: _ClassVar[int]
EXTERNAL_USER_ID_FIELD_NUMBER: _ClassVar[int]
PARTNER_ID_FIELD_NUMBER: _ClassVar[int]
CONTENT_FIELD_NUMBER: _ClassVar[int]
user_id: int
external_user_id: str
partner_id: int
content: EventContent
def __init__(self, user_id: _Optional[int] = ..., external_user_id: _Optional[str] = ..., partner_id: _Optional[int] = ..., content: _Optional[_Union[EventContent, _Mapping]] = ...) -> None: ...
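For orientation, a minimal round-trip with the generated messages above (field values are placeholders; this is independent of the dispatcher/webhook plumbing):

from app.events.generated import event_pb2

content = event_pb2.EventContent(
    alias_status_change=event_pb2.AliasStatusChange(
        alias_id=1, alias_email="alias@example.com", enabled=False
    )
)
event = event_pb2.Event(
    user_id=1, external_user_id="ext-user-1", partner_id=1, content=content
)
payload = event.SerializeToString()

decoded = event_pb2.Event()
decoded.ParseFromString(payload)
# The oneof in EventContent tells a consumer which payload type was set.
assert decoded.content.WhichOneof("content") == "alias_status_change"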

View File

@ -30,7 +30,9 @@ def apply_dmarc_policy_for_forward_phase(
) -> Tuple[Message, Optional[str]]:
spam_result = SpamdResult.extract_from_headers(msg, Phase.forward)
if not DMARC_CHECK_ENABLED or not spam_result:
LOG.i("DMARC check disabled")
return msg, None
LOG.i(f"Spam check result in {spam_result}")
from_header = get_header_unicode(msg[headers.FROM])
@ -131,7 +133,7 @@ def quarantine_dmarc_failed_forward_email(alias, contact, envelope, msg) -> Emai
refused_email = RefusedEmail.create(
full_report_path=s3_report_path, user_id=alias.user_id, flush=True
)
return EmailLog.create(
email_log = EmailLog.create(
user_id=alias.user_id,
mailbox_id=alias.mailbox_id,
contact_id=contact.id,
@ -142,6 +144,7 @@ def quarantine_dmarc_failed_forward_email(alias, contact, envelope, msg) -> Emai
blocked=True,
commit=True,
)
return email_log
def apply_dmarc_policy_for_reply_phase(
@ -149,8 +152,10 @@ def apply_dmarc_policy_for_reply_phase(
) -> Optional[str]:
spam_result = SpamdResult.extract_from_headers(msg, Phase.reply)
if not DMARC_CHECK_ENABLED or not spam_result:
LOG.i("DMARC check disabled")
return None
LOG.i(f"Spam check result is {spam_result}")
if spam_result.dmarc not in (
DmarcCheckResult.quarantine,
DmarcCheckResult.reject,

View File

@ -3,6 +3,7 @@ from email.header import Header
from email.message import Message
from app.email import headers
from app import config
from app.email_utils import add_or_replace_header, delete_header
from app.handler.unsubscribe_encoder import (
UnsubscribeEncoder,
@ -47,6 +48,11 @@ class UnsubscribeGenerator:
method = raw_method[start + 1 : end]
url_data = urllib.parse.urlparse(method)
if url_data.scheme == "mailto":
if url_data.path == config.UNSUBSCRIBER:
LOG.debug(
f"Skipping replacing unsubscribe since the original email already points to {config.UNSUBSCRIBER}"
)
return message
query_data = urllib.parse.parse_qs(url_data.query)
mailto_unsubs = (url_data.path, query_data.get("subject", [""])[0])
LOG.debug(f"Unsub is mailto to {mailto_unsubs}")

View File

@ -5,6 +5,7 @@ from typing import Optional
from aiosmtpd.smtp import Envelope
from app import config
from app import alias_utils
from app.db import Session
from app.email import headers, status
from app.email_utils import (
@ -101,7 +102,7 @@ class UnsubscribeHandler:
mailbox.email, alias
):
return status.E509
alias.enabled = False
alias_utils.change_alias_status(alias, enabled=False)
Session.commit()
enable_alias_url = config.URL + f"/dashboard/?highlight_alias_id={alias.id}"
for mailbox in alias.mailboxes:

View File

@ -30,7 +30,10 @@ def handle_batch_import(batch_import: BatchImport):
LOG.d("Download file %s from %s", batch_import.file, file_url)
r = requests.get(file_url)
lines = [line.decode("utf-8") for line in r.iter_lines()]
# Replace invisible character
lines = [
line.decode("utf-8").replace("\ufeff", "").strip() for line in r.iter_lines()
]
import_from_csv(batch_import, user, lines)

View File

@ -27,7 +27,7 @@ from sqlalchemy.orm import deferred
from sqlalchemy.sql import and_
from sqlalchemy_utils import ArrowType
from app import config
from app import config, rate_limiter
from app import s3
from app.db import Session
from app.dns_utils import get_mx_domains
@ -235,6 +235,7 @@ class AuditLogActionEnum(EnumE):
download_provider_complaint = 8
disable_user = 9
enable_user = 10
stop_trial = 11
class Phase(EnumE):
@ -524,6 +525,11 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
sa.Boolean, default=True, nullable=False, server_default="1"
)
# user opted in for data breach check
enable_data_breach_check = sa.Column(
sa.Boolean, default=False, nullable=False, server_default="0"
)
# bitwise flags. Allow for future expansion
flags = sa.Column(
sa.BigInteger,
@ -651,6 +657,21 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return user
@classmethod
def delete(cls, obj_id, commit=False):
# Internal import to avoid global import cycles
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import UserDeleted, EventContent
user: User = cls.get(obj_id)
EventDispatcher.send_event(user, EventContent(user_deleted=UserDeleted()))
res = super(User, cls).delete(obj_id)
if commit:
Session.commit()
return res
def get_active_subscription(
self, include_partner_subscription: bool = True
) -> Optional[
@ -726,6 +747,11 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return True
def is_active(self) -> bool:
if self.delete_on is None:
return True
return self.delete_on < arrow.now()
def in_trial(self):
"""return True if user does not have lifetime licence or an active subscription AND is in trial period"""
if self.lifetime_or_active_subscription():
@ -827,6 +853,9 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
Whether user can create a new alias. User can't create a new alias if
- has more than 15 aliases in the free plan, *even in the free trial*
"""
if not self.is_active():
return False
if self.disabled:
return False
@ -907,7 +936,11 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return sub
def verified_custom_domains(self) -> List["CustomDomain"]:
return CustomDomain.filter_by(user_id=self.id, ownership_verified=True).all()
return (
CustomDomain.filter_by(user_id=self.id, ownership_verified=True)
.order_by(CustomDomain.domain.asc())
.all()
)
def mailboxes(self) -> List["Mailbox"]:
"""list of mailbox that user own"""
@ -1113,6 +1146,13 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return random_words(1)
def can_create_contacts(self) -> bool:
if self.is_premium():
return True
if self.flags & User.FLAG_FREE_DISABLE_CREATE_ALIAS == 0:
return True
return not config.DISABLE_CREATE_CONTACTS_FOR_FREE_USERS
def __repr__(self):
return f"<User {self.id} {self.name} {self.email}>"
@ -1402,6 +1442,9 @@ def generate_random_alias_email(
class Alias(Base, ModelMixin):
__tablename__ = "alias"
FLAG_PARTNER_CREATED = 1 << 0
user_id = sa.Column(
sa.ForeignKey(User.id, ondelete="cascade"), nullable=False, index=True
)
@ -1411,6 +1454,9 @@ class Alias(Base, ModelMixin):
name = sa.Column(sa.String(128), nullable=True, default=None)
enabled = sa.Column(sa.Boolean(), default=True, nullable=False)
flags = sa.Column(
sa.BigInteger(), default=0, server_default="0", nullable=False, index=True
)
custom_domain_id = sa.Column(
sa.ForeignKey("custom_domain.id", ondelete="cascade"), nullable=True, index=True
@ -1488,6 +1534,8 @@ class Alias(Base, ModelMixin):
TSVector(), sa.Computed("to_tsvector('english', note)", persisted=True)
)
last_email_log_id = sa.Column(sa.Integer, default=None, nullable=True)
__table_args__ = (
Index("ix_video___ts_vector__", ts_vector, postgresql_using="gin"),
# index on note column using pg_trgm
@ -1506,7 +1554,8 @@ class Alias(Base, ModelMixin):
def mailboxes(self):
ret = [self.mailbox]
for m in self._mailboxes:
ret.append(m)
if m.id is not self.mailbox.id:
ret.append(m)
ret = [mb for mb in ret if mb.verified]
ret = sorted(ret, key=lambda mb: mb.email)
@ -1555,6 +1604,15 @@ class Alias(Base, ModelMixin):
flush = kw.pop("flush", False)
new_alias = cls(**kw)
user = User.get(new_alias.user_id)
if user.is_premium():
limits = config.ALIAS_CREATE_RATE_LIMIT_PAID
else:
limits = config.ALIAS_CREATE_RATE_LIMIT_FREE
# limits is an array of (hits, days) tuples
for limit in limits:
key = f"alias_create_{limit[1]}d:{user.id}"
rate_limiter.check_bucket_limit(key, limit[0], limit[1])
email = kw["email"]
# make sure email is lowercase and doesn't have any whitespace
@ -1576,6 +1634,18 @@ class Alias(Base, ModelMixin):
Session.add(new_alias)
DailyMetric.get_or_create_today_metric().nb_alias += 1
# Internal import to avoid global import cycles
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import AliasCreated, EventContent
event = AliasCreated(
alias_id=new_alias.id,
alias_email=new_alias.email,
alias_note=new_alias.note,
enabled=True,
)
EventDispatcher.send_event(user, EventContent(alias_created=event))
if commit:
Session.commit()
@ -2037,6 +2107,20 @@ class EmailLog(Base, ModelMixin):
def get_dashboard_url(self):
return f"{config.URL}/dashboard/refused_email?highlight_id={self.id}"
@classmethod
def create(cls, *args, **kwargs):
commit = kwargs.pop("commit", False)
email_log = super().create(*args, **kwargs)
Session.flush()
if "alias_id" in kwargs:
sql = "UPDATE alias SET last_email_log_id = :el_id WHERE id = :alias_id"
Session.execute(
sql, {"el_id": email_log.id, "alias_id": kwargs["alias_id"]}
)
if commit:
Session.commit()
return email_log
def __repr__(self):
return f"<EmailLog {self.id}>"
@ -2540,10 +2624,13 @@ class Job(Base, ModelMixin):
nullable=False,
server_default=str(JobState.ready.value),
default=JobState.ready.value,
index=True,
)
attempts = sa.Column(sa.Integer, nullable=False, server_default="0", default=0)
taken_at = sa.Column(ArrowType, nullable=True)
__table_args__ = (Index("ix_state_run_at_taken_at", state, run_at, taken_at),)
def __repr__(self):
return f"<Job {self.id} {self.name} {self.payload}>"
@ -2589,10 +2676,15 @@ class Mailbox(Base, ModelMixin):
return False
def nb_alias(self):
return (
AliasMailbox.filter_by(mailbox_id=self.id).count()
+ Alias.filter_by(mailbox_id=self.id).count()
alias_ids = set(
am.alias_id
for am in AliasMailbox.filter_by(mailbox_id=self.id).values(
AliasMailbox.alias_id
)
)
for alias in Alias.filter_by(mailbox_id=self.id).values(Alias.id):
alias_ids.add(alias.id)
return len(alias_ids)
def is_proton(self) -> bool:
if (
@ -2641,12 +2733,15 @@ class Mailbox(Base, ModelMixin):
@property
def aliases(self) -> [Alias]:
ret = Alias.filter_by(mailbox_id=self.id).all()
ret = dict(
(alias.id, alias) for alias in Alias.filter_by(mailbox_id=self.id).all()
)
for am in AliasMailbox.filter_by(mailbox_id=self.id):
ret.append(am.alias)
if am.alias_id not in ret:
ret[am.alias_id] = am.alias
return ret
return list(ret.values())
@classmethod
def create(cls, **kw):
@ -2891,7 +2986,9 @@ class RecoveryCode(Base, ModelMixin):
class Notification(Base, ModelMixin):
__tablename__ = "notification"
user_id = sa.Column(sa.ForeignKey(User.id, ondelete="cascade"), nullable=False)
user_id = sa.Column(
sa.ForeignKey(User.id, ondelete="cascade"), nullable=False, index=True
)
message = sa.Column(sa.Text, nullable=False)
title = sa.Column(sa.String(512))
@ -3132,6 +3229,20 @@ class TransactionalEmail(Base, ModelMixin):
__table_args__ = (sa.Index("ix_transactional_email_created_at", "created_at"),)
@classmethod
def create(cls, **kw):
# whether to call Session.commit
commit = kw.pop("commit", False)
r = cls(**kw)
if not config.STORE_TRANSACTIONAL_EMAILS:
return r
Session.add(r)
if commit:
Session.commit()
return r
class Payout(Base, ModelMixin):
"""Referral payouts"""
@ -3322,6 +3433,15 @@ class AdminAuditLog(Base):
},
)
@classmethod
def stop_trial(cls, admin_user_id: int, user_id: int):
cls.create(
admin_user_id=admin_user_id,
action=AuditLogActionEnum.stop_trial.value,
model="User",
model_id=user_id,
)
@classmethod
def disable_otp_fido(
cls, admin_user_id: int, user_id: int, had_otp: bool, had_fido: bool
@ -3555,3 +3675,52 @@ class ApiToCookieToken(Base, ModelMixin):
code = secrets.token_urlsafe(32)
return super().create(code=code, **kwargs)
class SyncEvent(Base, ModelMixin):
"""This model holds the events that need to be sent to the webhook"""
__tablename__ = "sync_event"
content = sa.Column(sa.LargeBinary, unique=False, nullable=False)
taken_time = sa.Column(
ArrowType, default=None, nullable=True, server_default=None, index=True
)
__table_args__ = (
sa.Index("ix_sync_event_created_at", "created_at"),
sa.Index("ix_sync_event_taken_time", "taken_time"),
)
def mark_as_taken(self) -> bool:
sql = """
UPDATE sync_event
SET taken_time = :taken_time
WHERE id = :sync_event_id
AND taken_time IS NULL
"""
args = {"taken_time": arrow.now().datetime, "sync_event_id": self.id}
res = Session.execute(sql, args)
Session.commit()
return res.rowcount > 0
@classmethod
def get_dead_letter(cls, older_than: Arrow) -> [SyncEvent]:
return (
SyncEvent.filter(
(
(
SyncEvent.taken_time.isnot(None)
& (SyncEvent.taken_time < older_than)
)
| (
SyncEvent.taken_time.is_(None)
& (SyncEvent.created_at < older_than)
)
)
)
.order_by(SyncEvent.id)
.limit(100)
.all()
)

View File

@ -140,7 +140,7 @@ def authorize():
Scope=Scope,
)
else: # POST - user allows or denies
if not current_user.is_authenticated or not current_user.is_active:
if not current_user.is_authenticated or not current_user.is_active():
LOG.i(
"Attempt to validate a OAUth allow request by an unauthenticated user"
)

42
app/rate_limiter.py Normal file
View File

@ -0,0 +1,42 @@
from datetime import datetime
from typing import Optional
import newrelic.agent
import redis.exceptions
import werkzeug.exceptions
from limits.storage import RedisStorage
from app.log import LOG
lock_redis: Optional[RedisStorage] = None
def set_redis_concurrent_lock(redis: RedisStorage):
global lock_redis
lock_redis = redis
def check_bucket_limit(
lock_name: Optional[str] = None,
max_hits: int = 5,
bucket_seconds: int = 3600,
):
# Calculate current bucket time
int_time = int(datetime.utcnow().timestamp())
bucket_id = int_time - (int_time % bucket_seconds)
bucket_lock_name = f"bl:{lock_name}:{bucket_id}"
if not lock_redis:
return
try:
value = lock_redis.incr(bucket_lock_name, bucket_seconds)
if value > max_hits:
LOG.i(
f"Rate limit hit for {lock_name} (bucket id {bucket_id}) -> {value}/{max_hits}"
)
newrelic.agent.record_custom_event(
"BucketRateLimit",
{"lock_name": lock_name, "bucket_seconds": bucket_seconds},
)
raise werkzeug.exceptions.TooManyRequests()
except (redis.exceptions.RedisError, AttributeError):
LOG.e("Cannot connect to redis")

View File

@ -2,6 +2,7 @@ import flask
import limits.storage
from app.parallel_limiter import set_redis_concurrent_lock
from app.rate_limiter import set_redis_concurrent_lock as rate_limit_set_redis
from app.session import RedisSessionStore
@ -10,12 +11,14 @@ def initialize_redis_services(app: flask.Flask, redis_url: str):
storage = limits.storage.RedisStorage(redis_url)
app.session_interface = RedisSessionStore(storage.storage, storage.storage, app)
set_redis_concurrent_lock(storage)
rate_limit_set_redis(storage)
elif redis_url.startswith("redis+sentinel://"):
storage = limits.storage.RedisSentinelStorage(redis_url)
app.session_interface = RedisSessionStore(
storage.storage, storage.storage_slave, app
)
set_redis_concurrent_lock(storage)
rate_limit_set_redis(storage)
else:
raise RuntimeError(
f"Tried to set_redis_session with an invalid redis url: ${redis_url}"

View File

@ -5,19 +5,9 @@ from typing import Optional
import boto3
import requests
from app.config import (
AWS_REGION,
BUCKET,
AWS_ACCESS_KEY_ID,
AWS_SECRET_ACCESS_KEY,
LOCAL_FILE_UPLOAD,
UPLOAD_DIR,
URL,
AWS_ENDPOINT_URL,
)
from app import config
from app.log import LOG
_s3_client = None
@ -25,12 +15,12 @@ def _get_s3client():
global _s3_client
if _s3_client is None:
args = {
"aws_access_key_id": AWS_ACCESS_KEY_ID,
"aws_secret_access_key": AWS_SECRET_ACCESS_KEY,
"region_name": AWS_REGION,
"aws_access_key_id": config.AWS_ACCESS_KEY_ID,
"aws_secret_access_key": config.AWS_SECRET_ACCESS_KEY,
"region_name": config.AWS_REGION,
}
if AWS_ENDPOINT_URL:
args["endpoint_url"] = AWS_ENDPOINT_URL
if config.AWS_ENDPOINT_URL:
args["endpoint_url"] = config.AWS_ENDPOINT_URL
_s3_client = boto3.client("s3", **args)
return _s3_client
@ -38,8 +28,8 @@ def _get_s3client():
def upload_from_bytesio(key: str, bs: BytesIO, content_type="application/octet-stream"):
bs.seek(0)
if LOCAL_FILE_UPLOAD:
file_path = os.path.join(UPLOAD_DIR, key)
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, key)
file_dir = os.path.dirname(file_path)
os.makedirs(file_dir, exist_ok=True)
with open(file_path, "wb") as f:
@ -47,7 +37,7 @@ def upload_from_bytesio(key: str, bs: BytesIO, content_type="application/octet-s
else:
_get_s3client().put_object(
Bucket=BUCKET,
Bucket=config.BUCKET,
Key=key,
Body=bs,
ContentType=content_type,
@ -57,8 +47,8 @@ def upload_from_bytesio(key: str, bs: BytesIO, content_type="application/octet-s
def upload_email_from_bytesio(path: str, bs: BytesIO, filename):
bs.seek(0)
if LOCAL_FILE_UPLOAD:
file_path = os.path.join(UPLOAD_DIR, path)
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, path)
file_dir = os.path.dirname(file_path)
os.makedirs(file_dir, exist_ok=True)
with open(file_path, "wb") as f:
@ -66,7 +56,7 @@ def upload_email_from_bytesio(path: str, bs: BytesIO, filename):
else:
_get_s3client().put_object(
Bucket=BUCKET,
Bucket=config.BUCKET,
Key=path,
Body=bs,
# Support saving a remote file using Http header
@ -77,12 +67,12 @@ def upload_email_from_bytesio(path: str, bs: BytesIO, filename):
def download_email(path: str) -> Optional[str]:
if LOCAL_FILE_UPLOAD:
file_path = os.path.join(UPLOAD_DIR, path)
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, path)
with open(file_path, "rb") as f:
return f.read()
resp = _get_s3client().get_object(
Bucket=BUCKET,
Bucket=config.BUCKET,
Key=path,
)
if not resp or "Body" not in resp:
@ -96,29 +86,30 @@ def upload_from_url(url: str, upload_path):
def get_url(key: str, expires_in=3600) -> str:
if LOCAL_FILE_UPLOAD:
return URL + "/static/upload/" + key
if config.LOCAL_FILE_UPLOAD:
return config.URL + "/static/upload/" + key
else:
return _get_s3client().generate_presigned_url(
ExpiresIn=expires_in,
ClientMethod="get_object",
Params={"Bucket": BUCKET, "Key": key},
Params={"Bucket": config.BUCKET, "Key": key},
)
def delete(path: str):
if LOCAL_FILE_UPLOAD:
os.remove(os.path.join(UPLOAD_DIR, path))
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, path)
os.remove(file_path)
else:
_get_s3client().delete_object(Bucket=BUCKET, Key=path)
_get_s3client().delete_object(Bucket=config.BUCKET, Key=path)
def create_bucket_if_not_exists():
s3client = _get_s3client()
buckets = s3client.list_buckets()
for bucket in buckets["Buckets"]:
if bucket["Name"] == BUCKET:
if bucket["Name"] == config.BUCKET:
LOG.i("Bucket already exists")
return
s3client.create_bucket(Bucket=BUCKET)
LOG.i(f"Bucket {BUCKET} created")
s3client.create_bucket(Bucket=config.BUCKET)
LOG.i(f"Bucket {config.BUCKET} created")

View File

@ -2,6 +2,8 @@ import requests
from requests import RequestException
from app import config
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import EventContent, UserPlanChange
from app.log import LOG
from app.models import User
@ -31,3 +33,6 @@ def execute_subscription_webhook(user: User):
)
except RequestException as e:
LOG.error(f"Subscription request exception: {e}")
event = UserPlanChange(plan_end_time=sl_subscription_end)
EventDispatcher.send_event(user, EventContent(user_plan_change=event))

View File

@ -49,11 +49,11 @@ def random_string(length=10, include_digits=False):
def convert_to_id(s: str):
"""convert a string to id-like: remove space, remove special accent"""
s = s.replace(" ", "")
s = s.lower()
s = unidecode(s)
s = s.replace(" ", "")
return s
return s[:256]
_ALLOWED_CHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_-."

176
cron.py
View File

@ -61,6 +61,11 @@ from app.pgp_utils import load_public_key_and_check, PGPException
from app.proton.utils import get_proton_partner
from app.utils import sanitize_email
from server import create_light_app
from tasks.cleanup_old_imports import cleanup_old_imports
from tasks.cleanup_old_jobs import cleanup_old_jobs
from tasks.cleanup_old_notifications import cleanup_old_notifications
DELETE_GRACE_DAYS = 30
def notify_trial_end():
@ -960,6 +965,9 @@ async def _hibp_check(api_key, queue):
This function is meant to be run concurrently (multiple _hibp_check coroutines with different keys on the same queue) to make maximum use of multiple API keys.
"""
default_rate_sleep = (60.0 / config.HIBP_RPM) + 0.1
rate_sleep = default_rate_sleep
rate_hit_counter = 0
while True:
try:
alias_id = queue.get_nowait()
@ -967,9 +975,14 @@ async def _hibp_check(api_key, queue):
return
alias = Alias.get(alias_id)
# an alias can be deleted in the meantime
if not alias:
return
continue
user = alias.user
if user.disabled or not user.is_paid():
# Mark it as hibp done to skip it as if it had been checked
alias.hibp_last_check = arrow.utcnow()
Session.commit()
continue
LOG.d("Checking HIBP for %s", alias)
@ -981,7 +994,6 @@ async def _hibp_check(api_key, queue):
f"https://haveibeenpwned.com/api/v3/breachedaccount/{urllib.parse.quote(alias.email)}",
headers=request_headers,
)
if r.status_code == 200:
# Breaches found
alias.hibp_breaches = [
@ -989,20 +1001,27 @@ async def _hibp_check(api_key, queue):
]
if len(alias.hibp_breaches) > 0:
LOG.w("%s appears in HIBP breaches %s", alias, alias.hibp_breaches)
if rate_hit_counter > 0:
rate_hit_counter -= 1
elif r.status_code == 404:
# No breaches found
alias.hibp_breaches = []
elif r.status_code == 429:
# rate limited
LOG.w("HIBP rate limited, check alias %s in the next run", alias)
await asyncio.sleep(1.6)
return
rate_hit_counter += 1
rate_sleep = default_rate_sleep + (0.2 * rate_hit_counter)
if rate_hit_counter > 10:
LOG.w(f"HIBP rate limited too many times stopping with alias {alias}")
return
# Just sleep for a while
await asyncio.sleep(5)
elif r.status_code > 500:
LOG.w("HIBP server 5** error %s", r.status_code)
return
else:
LOG.error(
"An error occured while checking alias %s: %s - %s",
"An error occurred while checking alias %s: %s - %s",
alias,
r.status_code,
r.text,
@ -1013,9 +1032,63 @@ async def _hibp_check(api_key, queue):
Session.add(alias)
Session.commit()
LOG.d("Updated breaches info for %s", alias)
LOG.d("Updated breach info for %s", alias)
await asyncio.sleep(rate_sleep)
await asyncio.sleep(1.6)
def get_alias_to_check_hibp(
oldest_hibp_allowed: arrow.Arrow,
user_ids_to_skip: list[int],
min_alias_id: int,
max_alias_id: int,
):
now = arrow.now()
alias_query = (
Session.query(Alias)
.join(User, User.id == Alias.user_id)
.join(Subscription, User.id == Subscription.user_id, isouter=True)
.join(ManualSubscription, User.id == ManualSubscription.user_id, isouter=True)
.join(AppleSubscription, User.id == AppleSubscription.user_id, isouter=True)
.join(
CoinbaseSubscription,
User.id == CoinbaseSubscription.user_id,
isouter=True,
)
.join(PartnerUser, User.id == PartnerUser.user_id, isouter=True)
.join(
PartnerSubscription,
PartnerSubscription.partner_user_id == PartnerUser.id,
isouter=True,
)
.filter(
or_(
Alias.hibp_last_check.is_(None),
Alias.hibp_last_check < oldest_hibp_allowed,
),
Alias.user_id.notin_(user_ids_to_skip),
Alias.enabled,
Alias.id >= min_alias_id,
Alias.id < max_alias_id,
User.disabled == False, # noqa: E712
User.enable_data_breach_check,
or_(
User.lifetime,
ManualSubscription.end_at > now,
Subscription.next_bill_date > now.date(),
AppleSubscription.expires_date > now,
CoinbaseSubscription.end_at > now,
PartnerSubscription.end_at > now,
),
)
)
if config.HIBP_SKIP_PARTNER_ALIAS:
alias_query = alias_query.filter(
Alias.flags.op("&")(Alias.FLAG_PARTNER_CREATED) == 0
)
for alias in (
alias_query.order_by(Alias.id.asc()).enable_eagerloads(False).yield_per(500)
):
yield alias
async def check_hibp():
@ -1038,40 +1111,49 @@ async def check_hibp():
Session.commit()
LOG.d("Updated list of known breaches")
LOG.d("Preparing list of aliases to check")
LOG.d("Getting the list of users to skip")
query = "select u.id, count(a.id) from users u, alias a where a.user_id=u.id group by u.id having count(a.id) > :max_alias"
rows = Session.execute(query, {"max_alias": config.HIBP_MAX_ALIAS_CHECK})
user_ids = [row[0] for row in rows]
LOG.d("Got %d users to skip" % len(user_ids))
LOG.d("Checking aliases")
queue = asyncio.Queue()
max_date = arrow.now().shift(days=-config.HIBP_SCAN_INTERVAL_DAYS)
for alias in (
Alias.filter(
or_(Alias.hibp_last_check.is_(None), Alias.hibp_last_check < max_date)
min_alias_id = 0
max_alias_id = Session.query(func.max(Alias.id)).scalar()
step = 10000
now = arrow.now()
oldest_hibp_allowed = now.shift(days=-config.HIBP_SCAN_INTERVAL_DAYS)
alias_checked = 0
for alias_batch_id in range(min_alias_id, max_alias_id, step):
for alias in get_alias_to_check_hibp(
oldest_hibp_allowed, user_ids, alias_batch_id, alias_batch_id + step
):
await queue.put(alias.id)
alias_checked += queue.qsize()
LOG.d(
f"Need to check about {queue.qsize()} aliases in this loop {alias_batch_id}/{max_alias_id}"
)
.filter(Alias.enabled)
.order_by(Alias.hibp_last_check.asc())
.yield_per(500)
.enable_eagerloads(False)
):
await queue.put(alias.id)
LOG.d("Need to check about %s aliases", queue.qsize())
# Start one checking process per API key
# Each checking process will take one alias from the queue, get the info
# and then sleep for 1.5 seconds (due to HIBP API request limits)
checkers = []
for i in range(len(config.HIBP_API_KEYS)):
checker = asyncio.create_task(
_hibp_check(
config.HIBP_API_KEYS[i],
queue,
# Start one checking process per API key
# Each checking process will take one alias from the queue, get the info
# and then sleep for 1.5 seconds (due to HIBP API request limits)
checkers = []
for i in range(len(config.HIBP_API_KEYS)):
checker = asyncio.create_task(
_hibp_check(
config.HIBP_API_KEYS[i],
queue,
)
)
)
checkers.append(checker)
checkers.append(checker)
# Wait until all checking processes are done
for checker in checkers:
await checker
# Wait until all checking processes are done
for checker in checkers:
await checker
LOG.d("Done checking HIBP API for aliases in breaches")
LOG.d(f"Done checking {alias_checked} HIBP API for aliases in breaches")
def notify_hibp():
@ -1126,18 +1208,30 @@ def notify_hibp():
Session.commit()
def clear_users_scheduled_to_be_deleted():
def clear_users_scheduled_to_be_deleted(dry_run=False):
users = User.filter(
and_(User.delete_on.isnot(None), User.delete_on < arrow.now())
and_(
User.delete_on.isnot(None),
User.delete_on <= arrow.now().shift(days=-DELETE_GRACE_DAYS),
)
).all()
for user in users:
LOG.i(
f"Scheduled deletion of user {user} with scheduled delete on {user.delete_on}"
)
if dry_run:
continue
User.delete(user.id)
Session.commit()
def delete_old_data():
oldest_valid = arrow.now().shift(days=-config.KEEP_OLD_DATA_DAYS)
cleanup_old_imports(oldest_valid)
cleanup_old_jobs(oldest_valid)
cleanup_old_notifications(oldest_valid)
if __name__ == "__main__":
LOG.d("Start running cronjob")
parser = argparse.ArgumentParser()
@ -1152,6 +1246,7 @@ if __name__ == "__main__":
"notify_manual_subscription_end",
"notify_premium_end",
"delete_logs",
"delete_old_data",
"poll_apple_subscription",
"sanity_check",
"delete_old_monitoring",
@ -1180,6 +1275,9 @@ if __name__ == "__main__":
elif args.job == "delete_logs":
LOG.d("Deleted Logs")
delete_logs()
elif args.job == "delete_old_data":
LOG.d("Delete old data")
delete_old_data()
elif args.job == "poll_apple_subscription":
LOG.d("Poll Apple Subscriptions")
poll_apple_subscription()
@ -1206,4 +1304,4 @@ if __name__ == "__main__":
load_unsent_mails_from_fs_and_resend()
elif args.job == "delete_scheduled_users":
LOG.d("Deleting users scheduled to be deleted")
clear_users_scheduled_to_be_deleted()
clear_users_scheduled_to_be_deleted(dry_run=True)

View File

@ -37,6 +37,12 @@ jobs:
schedule: "15 5 * * *"
captureStderr: true
- name: SimpleLogin Delete Old Data
command: python /code/cron.py -j delete_old_data
shell: /bin/bash
schedule: "30 5 * * *"
captureStderr: true
- name: SimpleLogin Poll Apple Subscriptions
command: python /code/cron.py -j poll_apple_subscription
shell: /bin/bash
@ -62,7 +68,7 @@ jobs:
captureStderr: true
- name: SimpleLogin delete users scheduled to be deleted
command: echo disabled_user_deletion #python /code/cron.py -j delete_scheduled_users
command: python /code/cron.py -j delete_scheduled_users
shell: /bin/bash
schedule: "15 11 * * *"
captureStderr: true

View File

@ -388,7 +388,7 @@ Input:
- (Optional but recommended) `hostname` passed in query string
- Request Message Body in json (`Content-Type` is `application/json`)
- alias_prefix: string. The first part of the alias that user can choose.
- signed_suffix: should be one of the suffixes returned in the `GET /api/v4/alias/options` endpoint.
- signed_suffix: should be one of the suffixes returned in the `GET /api/v5/alias/options` endpoint.
- mailbox_ids: list of mailbox_id that "owns" this alias
- (Optional) note: alias note
- (Optional) name: alias name
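For illustration only, the request body described above could be sent as follows; the base URL, endpoint path, API key and signed suffix are placeholders or assumptions, not values taken from this diff.

import requests

BASE_URL = "https://app.simplelogin.io"  # assumption
resp = requests.post(
    f"{BASE_URL}/api/v3/alias/custom/new",  # endpoint assumed from the surrounding doc
    params={"hostname": "example.com"},  # optional but recommended
    headers={"Authentication": "API_KEY"},  # placeholder API key
    json={
        "alias_prefix": "news",
        "signed_suffix": "<a signed suffix from GET /api/v5/alias/options>",
        "mailbox_ids": [1],
        "note": "optional note",
        "name": "optional display name",
    },
)
print(resp.status_code, resp.json())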

View File

@ -53,7 +53,7 @@ from flanker.addresslib.address import EmailAddress
from sqlalchemy.exc import IntegrityError
from app import pgp_utils, s3, config
from app.alias_utils import try_auto_create
from app.alias_utils import try_auto_create, change_alias_status
from app.config import (
EMAIL_DOMAIN,
URL,
@ -236,15 +236,16 @@ def get_or_create_contact(from_header: str, mail_from: str, alias: Alias) -> Con
Session.commit()
else:
try:
contact_email_for_reply = (
contact_email if is_valid_email(contact_email) else ""
)
contact = Contact.create(
user_id=alias.user_id,
alias_id=alias.id,
website_email=contact_email,
name=contact_name,
mail_from=mail_from,
reply_email=generate_reply_email(contact_email, alias)
if is_valid_email(contact_email)
else NOREPLY,
reply_email=generate_reply_email(contact_email_for_reply, alias),
automatic_created=True,
)
if not contact_email:
@ -636,6 +637,10 @@ def handle_forward(envelope, msg: Message, rcpt_to: str) -> List[Tuple[bool, str
user = alias.user
if not user.is_active():
LOG.w(f"User {user} has been soft deleted")
return False, status.E502
if not user.can_send_or_receive():
LOG.i(f"User {user} cannot receive emails")
if should_ignore_bounce(envelope.mail_from):
@ -870,6 +875,7 @@ def forward_email_to_mailbox(
# References and In-Reply-To are used for keeping the email thread
headers.REFERENCES,
headers.IN_REPLY_TO,
headers.SL_QUEUE_ID,
headers.LIST_UNSUBSCRIBE,
headers.LIST_UNSUBSCRIBE_POST,
] + headers.MIME_HEADERS
@ -1055,6 +1061,9 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
if not contact:
LOG.w(f"No contact with {reply_email} as reverse alias")
return False, status.E502
if not contact.user.is_active():
LOG.w(f"User {contact.user} has been soft deleted")
return False, status.E502
alias = contact.alias
alias_address: str = contact.alias.email
@ -1171,6 +1180,7 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
# References and In-Reply-To are used for keeping the email thread
headers.REFERENCES,
headers.IN_REPLY_TO,
headers.SL_QUEUE_ID,
]
+ headers.MIME_HEADERS,
)
@ -1575,7 +1585,7 @@ def handle_bounce_forward_phase(msg: Message, email_log: EmailLog):
LOG.w(
f"Disable alias {alias} because {reason}. {alias.mailboxes} {alias.user}. Last contact {contact}"
)
alias.enabled = False
change_alias_status(alias, enabled=False)
Notification.create(
user_id=user.id,
@ -1883,24 +1893,30 @@ def handle_transactional_bounce(
envelope: Envelope, msg, rcpt_to, transactional_id=None
):
LOG.d("handle transactional bounce sent to %s", rcpt_to)
if transactional_id is None:
LOG.i(
f"No transactional record for {envelope.mail_from} -> {envelope.rcpt_tos}"
)
return
# parse the TransactionalEmail
transactional_id = transactional_id or parse_id_from_bounce(rcpt_to)
transactional = TransactionalEmail.get(transactional_id)
# a transaction might have been deleted in delete_logs()
if transactional:
LOG.i("Create bounce for %s", transactional.email)
bounce_info = get_mailbox_bounce_info(msg)
if bounce_info:
Bounce.create(
email=transactional.email,
info=bounce_info.as_bytes().decode(),
commit=True,
)
else:
LOG.w("cannot get bounce info, debug at %s", save_email_for_debugging(msg))
Bounce.create(email=transactional.email, commit=True)
if not transactional:
LOG.i(
f"No transactional record for {envelope.mail_from} -> {envelope.rcpt_tos}"
)
return
LOG.i("Create bounce for %s", transactional.email)
bounce_info = get_mailbox_bounce_info(msg)
if bounce_info:
Bounce.create(
email=transactional.email,
info=bounce_info.as_bytes().decode(),
commit=True,
)
else:
LOG.w("cannot get bounce info, debug at %s", save_email_for_debugging(msg))
Bounce.create(email=transactional.email, commit=True)
def handle_bounce(envelope, email_log: EmailLog, msg: Message) -> str:
@ -1921,6 +1937,9 @@ def handle_bounce(envelope, email_log: EmailLog, msg: Message) -> str:
contact,
alias,
)
if not email_log.user.is_active():
LOG.d(f"User {email_log.user} is not active")
return status.E510
if email_log.is_reply:
content_type = msg.get_content_type().lower()
@ -1982,6 +2001,9 @@ def send_no_reply_response(mail_from: str, msg: Message):
if not mailbox:
LOG.d("Unknown sender. Skipping reply from {}".format(NOREPLY))
return
if not mailbox.user.is_active():
LOG.d(f"User {mailbox.user} is soft-deleted. Skipping sending reply response")
return
send_email_at_most_times(
mailbox.user,
ALERT_TO_NOREPLY,
@ -2020,10 +2042,11 @@ def handle(envelope: Envelope, msg: Message) -> str:
return status.E204
# sanitize email headers
sanitize_header(msg, "from")
sanitize_header(msg, "to")
sanitize_header(msg, "cc")
sanitize_header(msg, "reply-to")
sanitize_header(msg, headers.FROM)
sanitize_header(msg, headers.TO)
sanitize_header(msg, headers.CC)
sanitize_header(msg, headers.REPLY_TO)
sanitize_header(msg, headers.MESSAGE_ID)
LOG.d(
"==>> Handle mail_from:%s, rcpt_tos:%s, header_from:%s, header_to:%s, "

64
event_listener.py Normal file
View File

@ -0,0 +1,64 @@
import argparse
from enum import Enum
from sys import argv, exit
from app.config import DB_URI
from app.log import LOG
from events.runner import Runner
from events.event_source import DeadLetterEventSource, PostgresEventSource
from events.event_sink import ConsoleEventSink, HttpEventSink
class Mode(Enum):
DEAD_LETTER = "dead_letter"
LISTENER = "listener"
@staticmethod
def from_str(value: str):
if value == Mode.DEAD_LETTER.value:
return Mode.DEAD_LETTER
elif value == Mode.LISTENER.value:
return Mode.LISTENER
else:
raise ValueError(f"Invalid mode: {value}")
def main(mode: Mode, dry_run: bool):
if mode == Mode.DEAD_LETTER:
LOG.i("Using DeadLetterEventSource")
source = DeadLetterEventSource()
elif mode == Mode.LISTENER:
LOG.i("Using PostgresEventSource")
source = PostgresEventSource(DB_URI)
else:
raise ValueError(f"Invalid mode: {mode}")
if dry_run:
LOG.i("Starting with ConsoleEventSink")
sink = ConsoleEventSink()
else:
LOG.i("Starting with HttpEventSink")
sink = HttpEventSink()
runner = Runner(source=source, sink=sink)
runner.run()
def args():
parser = argparse.ArgumentParser(description="Run event listener")
parser.add_argument(
"mode",
help="Mode to run",
choices=[Mode.DEAD_LETTER.value, Mode.LISTENER.value],
)
parser.add_argument("--dry-run", help="Dry run mode", action="store_true")
return parser.parse_args()
if __name__ == "__main__":
if len(argv) < 2:
print("Invalid usage. Pass 'listener' or 'dead_letter' as argument")
exit(1)
args = args()
main(Mode.from_str(args.mode), args.dry_run)

0
events/__init__.py Normal file
View File

42
events/event_sink.py Normal file
View File

@ -0,0 +1,42 @@
import requests
from abc import ABC, abstractmethod
from app.config import EVENT_WEBHOOK, EVENT_WEBHOOK_SKIP_VERIFY_SSL
from app.log import LOG
from app.models import SyncEvent
class EventSink(ABC):
@abstractmethod
def process(self, event: SyncEvent) -> bool:
pass
class HttpEventSink(EventSink):
def process(self, event: SyncEvent) -> bool:
if not EVENT_WEBHOOK:
LOG.warning("Skipping sending event because there is no webhook configured")
return False
LOG.info(f"Sending event {event.id} to {EVENT_WEBHOOK}")
res = requests.post(
url=EVENT_WEBHOOK,
data=event.content,
headers={"Content-Type": "application/x-protobuf"},
verify=not EVENT_WEBHOOK_SKIP_VERIFY_SSL,
)
if res.status_code != 200:
LOG.warning(
f"Failed to send event to webhook: {res.status_code} {res.text}"
)
return False
else:
LOG.info(f"Event {event.id} sent successfully to webhook")
return True
class ConsoleEventSink(EventSink):
def process(self, event: SyncEvent) -> bool:
LOG.info(f"Handling event {event.id}")
return True
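A small, illustrative sketch of driving a sink by hand (loosely mirroring what the event source and runner wiring elsewhere in this diff does; drain_stale_events is made up): stale events are fetched via SyncEvent.get_dead_letter, claimed with mark_as_taken so concurrent runners skip them, then handed to the sink's process method.

import arrow

from app.models import SyncEvent
from events.event_sink import ConsoleEventSink

def drain_stale_events() -> None:
    sink = ConsoleEventSink()
    threshold = arrow.utcnow().shift(minutes=-10)
    for event in SyncEvent.get_dead_letter(older_than=threshold):
        # mark_as_taken() claims the row so another runner won't process it twice.
        if event.mark_as_taken():
            sink.process(event)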

100
events/event_source.py Normal file
View File

@ -0,0 +1,100 @@
import arrow
import newrelic.agent
import psycopg2
import select
from abc import ABC, abstractmethod
from app.log import LOG
from app.models import SyncEvent
from app.events.event_dispatcher import NOTIFICATION_CHANNEL
from time import sleep
from typing import Callable, NoReturn
_DEAD_LETTER_THRESHOLD_MINUTES = 10
_DEAD_LETTER_INTERVAL_SECONDS = 30
_POSTGRES_RECONNECT_INTERVAL_SECONDS = 5
class EventSource(ABC):
@abstractmethod
def run(self, on_event: Callable[[SyncEvent], NoReturn]):
pass
class PostgresEventSource(EventSource):
def __init__(self, connection_string: str):
self.__connection_string = connection_string
self.__connect()
def run(self, on_event: Callable[[SyncEvent], NoReturn]):
while True:
try:
self.__listen(on_event)
except Exception as e:
LOG.warn(f"Error listening to events: {e}")
sleep(_POSTGRES_RECONNECT_INTERVAL_SECONDS)
self.__connect()
def __listen(self, on_event: Callable[[SyncEvent], NoReturn]):
self.__connection.set_isolation_level(
psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT
)
cursor = self.__connection.cursor()
cursor.execute(f"LISTEN {NOTIFICATION_CHANNEL};")
while True:
if select.select([self.__connection], [], [], 5) != ([], [], []):
self.__connection.poll()
while self.__connection.notifies:
notify = self.__connection.notifies.pop(0)
LOG.debug(
f"Got NOTIFY: pid={notify.pid} channel={notify.channel} payload={notify.payload}"
)
try:
webhook_id = int(notify.payload)
event = SyncEvent.get_by(id=webhook_id)
if event is not None:
if event.mark_as_taken():
on_event(event)
else:
LOG.info(
f"Event {event.id} was handled by another runner"
)
else:
LOG.info(f"Could not find event with id={notify.payload}")
except Exception as e:
LOG.warn(f"Error getting event: {e}")
def __connect(self):
self.__connection = psycopg2.connect(self.__connection_string)
from app.db import Session
Session.close()
class DeadLetterEventSource(EventSource):
@newrelic.agent.background_task()
def run(self, on_event: Callable[[SyncEvent], NoReturn]):
while True:
try:
threshold = arrow.utcnow().shift(
minutes=-_DEAD_LETTER_THRESHOLD_MINUTES
)
events = SyncEvent.get_dead_letter(older_than=threshold)
if events:
LOG.info(f"Got {len(events)} dead letter events")
if events:
newrelic.agent.record_custom_metric(
"Custom/dead_letter_events_to_process", len(events)
)
for event in events:
on_event(event)
else:
LOG.debug("No dead letter events")
sleep(_DEAD_LETTER_INTERVAL_SECONDS)
except Exception as e:
LOG.warn(f"Error getting dead letter event: {e}")
sleep(_DEAD_LETTER_INTERVAL_SECONDS)

42
events/runner.py Normal file
View File

@ -0,0 +1,42 @@
import arrow
import newrelic.agent
from app.log import LOG
from app.models import SyncEvent
from events.event_sink import EventSink
from events.event_source import EventSource
class Runner:
def __init__(self, source: EventSource, sink: EventSink):
self.__source = source
self.__sink = sink
def run(self):
self.__source.run(self.__on_event)
@newrelic.agent.background_task()
def __on_event(self, event: SyncEvent):
try:
event_created_at = event.created_at
start_time = arrow.now()
success = self.__sink.process(event)
if success:
event_id = event.id
SyncEvent.delete(event.id, commit=True)
LOG.info(f"Marked {event_id} as done")
end_time = arrow.now() - start_time
time_between_taken_and_created = start_time - event_created_at
newrelic.agent.record_custom_metric("Custom/sync_event_processed", 1)
newrelic.agent.record_custom_metric(
"Custom/sync_event_process_time", end_time.total_seconds()
)
newrelic.agent.record_custom_metric(
"Custom/sync_event_elapsed_time",
time_between_taken_and_created.total_seconds(),
)
except Exception as e:
LOG.warn(f"Exception processing event [id={event.id}]: {e}")
newrelic.agent.record_custom_metric("Custom/sync_event_failed", 1)

View File

@ -116,6 +116,14 @@ WORDS_FILE_PATH=local_data/test_words.txt
# CONNECT_WITH_PROTON=true
# CONNECT_WITH_PROTON_COOKIE_NAME=to_fill
# Login with OIDC
# CONNECT_WITH_OIDC_ICON=fa-github
# OIDC_WELL_KNOWN_URL=to_fill
# OIDC_SCOPES=openid email profile
# OIDC_NAME_FIELD=name
# OIDC_CLIENT_ID=to_fill
# OIDC_CLIENT_SECRET=to_fill
# Flask profiler
# FLASK_PROFILER_PATH=/tmp/flask-profiler.sql
# FLASK_PROFILER_PASSWORD=password

View File

@ -192,7 +192,6 @@ amigos
amines
amnion
amoeba
amoral
amount
amours
ampere
@ -215,7 +214,6 @@ animus
anions
ankles
anklet
annals
anneal
annoys
annual
@ -364,7 +362,6 @@ auntie
aureus
aurora
author
autism
autumn
avails
avatar
@ -638,14 +635,12 @@ bigwig
bijoux
bikers
biking
bikini
bilges
bilked
bilker
billed
billet
billow
bimbos
binary
binder
binged
@ -710,8 +705,6 @@ blocks
blokes
blonde
blonds
bloods
bloody
blooms
bloops
blotch
@ -817,8 +810,6 @@ bounds
bounty
bovine
bovver
bowels
bowers
bowing
bowled
bowleg
@ -827,10 +818,8 @@ bowman
bowmen
bowwow
boxcar
boxers
boxier
boxing
boyish
braced
bracer
braces
@ -861,7 +850,6 @@ breach
breads
breaks
breams
breast
breath
breech
breeds
@ -872,9 +860,6 @@ brevet
brewed
brewer
briars
bribed
briber
bribes
bricks
bridal
brides
@ -926,13 +911,7 @@ buffed
buffer
buffet
bugged
bugger
bugled
bugler
bugles
builds
bulged
bulges
bulked
bulled
bullet
@ -1340,8 +1319,6 @@ clingy
clinic
clinks
clique
cloaca
cloaks
cloche
clocks
clomps
@ -1448,7 +1425,6 @@ comply
compos
conchs
concur
condom
condor
condos
coneys
@ -1568,8 +1544,6 @@ cranes
cranks
cranky
cranny
crapes
crappy
crated
crater
crates
@ -1585,7 +1559,6 @@ crazes
creaks
creaky
creams
creamy
crease
create
creche
@ -1594,8 +1567,6 @@ credos
creeds
creeks
creels
creeps
creepy
cremes
creole
crepes
@ -1728,9 +1699,6 @@ dainty
daises
damage
damask
dammed
dammit
damned
damped
dampen
damper
@ -1754,7 +1722,6 @@ darers
daring
darken
darker
darkie
darkly
darned
darner
@ -1763,8 +1730,6 @@ darter
dashed
dasher
dashes
daters
dating
dative
daubed
dauber
@ -1921,7 +1886,6 @@ dharma
dhotis
diadem
dialog
diaper
diatom
dibble
dicier
@ -1943,7 +1907,6 @@ digits
diking
diktat
dilate
dildos
dilute
dimity
dimmed
@ -2058,7 +2021,6 @@ dotted
double
doubly
doubts
douche
doughy
dourer
dourly
@ -2139,15 +2101,6 @@ duenna
duffed
duffer
dugout
dulcet
dulled
duller
dumber
dumbly
dumbos
dumdum
dumped
dumper
dunces
dunged
dunked
@ -2285,7 +2238,6 @@ endows
endued
endues
endure
enemas
energy
enfold
engage
@ -2333,7 +2285,6 @@ erects
ermine
eroded
erodes
erotic
errand
errant
errata
@ -2344,7 +2295,6 @@ eructs
erupts
escape
eschew
escort
escrow
escudo
espied
@ -2363,7 +2313,6 @@ ethnic
etudes
euchre
eulogy
eunuch
eureka
evaded
evader
@ -2392,7 +2341,6 @@ exempt
exerts
exeunt
exhale
exhort
exhume
exiled
exiles
@ -2415,7 +2363,6 @@ extant
extend
extent
extols
extort
extras
exuded
exudes
@ -2440,7 +2387,6 @@ faeces
faerie
faffed
fagged
faggot
failed
faille
fainer
@ -2473,18 +2419,10 @@ faring
farmed
farmer
farrow
farted
fascia
fasted
fasten
faster
father
fathom
fating
fatsos
fatten
fatter
fatwas
faucet
faults
faulty
@ -2532,7 +2470,6 @@ fesses
festal
fester
feting
fetish
fetter
fettle
feudal
@ -2617,9 +2554,7 @@ flaked
flakes
flambe
flamed
flamer
flames
flange
flanks
flared
flares
@ -2754,8 +2689,6 @@ franks
frappe
frauds
frayed
freaks
freaky
freely
freest
freeze
@ -2795,8 +2728,6 @@ fryers
frying
ftpers
ftping
fucked
fucker
fuddle
fudged
fudges
@ -2891,10 +2822,7 @@ gasbag
gashed
gashes
gasket
gasman
gasmen
gasped
gassed
gasses
gateau
gather
@ -3104,7 +3032,6 @@ grimed
grimes
grimly
grinds
gringo
griped
griper
gripes
@ -3186,8 +3113,6 @@ gypsum
gyrate
gyving
habits
hacked
hacker
hackle
hadith
haggis
@ -3195,8 +3120,6 @@ haggle
hailed
hairdo
haired
hajjes
hajjis
halest
haling
halite
@ -3223,11 +3146,8 @@ happen
haptic
harass
harden
harder
hardly
harems
haring
harked
harlot
harmed
harped
@ -3407,7 +3327,6 @@ hoofed
hoofer
hookah
hooked
hooker
hookup
hooped
hoopla
@ -3459,8 +3378,6 @@ huffed
hugely
hugest
hugged
hulled
huller
humane
humans
humble
@ -3667,8 +3584,6 @@ jacket
jading
jagged
jaguar
jailed
jailer
jalopy
jammed
jangle
@ -3689,8 +3604,6 @@ jejune
jelled
jellos
jennet
jerked
jerkin
jersey
jested
jester
@ -3814,11 +3727,7 @@ kidded
kidder
kiddie
kiddos
kidnap
kidney
killed
killer
kilned
kilted
kilter
kimono
@ -3827,15 +3736,11 @@ kinder
kindle
kindly
kingly
kinked
kiosks
kipped
kipper
kirsch
kismet
kissed
kisser
kisses
kiting
kitsch
kitted
@ -3847,10 +3752,6 @@ kluges
klutzy
knacks
knaves
kneads
kneels
knells
knifed
knifes
knight
knives
@ -4210,8 +4111,6 @@ lunges
lupine
lupins
luring
lurked
lurker
lusher
lushes
lushly
@ -4608,7 +4507,6 @@ muggle
mukluk
mulcts
mulish
mullah
mulled
mullet
mumble
@ -4721,9 +4619,6 @@ nickel
nicker
nickle
nieces
niggas
niggaz
nigger
niggle
nigher
nights
@ -4736,7 +4631,6 @@ ninjas
ninths
nipped
nipper
nipple
nitric
nitwit
nixing
@ -4781,15 +4675,6 @@ nozzle
nuance
nubbin
nubile
nuclei
nudest
nudged
nudges
nudism
nudist
nudity
nugget
nuking
numbed
number
numbly
@ -4804,7 +4689,6 @@ nutter
nuzzle
nybble
nylons
nympho
nymphs
oafish
oaring
@ -4885,7 +4769,6 @@ opting
option
opuses
oracle
orally
orange
orated
orates
@ -4897,7 +4780,6 @@ ordeal
orders
ordure
organs
orgasm
orgies
oriels
orient
@ -4993,10 +4875,6 @@ pander
panels
panics
panned
panted
pantie
pantos
pantry
papacy
papaya
papers
@ -5078,7 +4956,6 @@ pebble
pebbly
pecans
pecked
pecker
pectic
pectin
pedalo
@ -5151,9 +5028,6 @@ phenom
phials
phlegm
phloem
phobia
phobic
phoebe
phoned
phones
phoney
@ -5228,9 +5102,6 @@ piques
piracy
pirate
pirogi
pissed
pisser
pisses
pistes
pistil
pistol
@ -5311,8 +5182,6 @@ pogrom
points
pointy
poised
poises
poison
pokers
pokeys
pokier
@ -5422,7 +5291,6 @@ preyed
priced
prices
pricey
pricks
prided
prides
priers
@ -5602,14 +5470,9 @@ rabbit
rabble
rabies
raceme
racers
racial
racier
racily
racing
racism
racist
racked
racket
radars
radial
@ -5661,8 +5524,6 @@ rapers
rapids
rapier
rapine
raping
rapist
rapped
rappel
rapper
@ -5747,7 +5608,6 @@ recoup
rectal
rector
rectos
rectum
recurs
recuse
redact
@ -5891,7 +5751,6 @@ resume
retail
retain
retake
retard
retell
retest
retied
@ -6125,8 +5984,6 @@ sadden
sadder
saddle
sadhus
sadism
sadist
safari
safely
safest
@ -6364,16 +6221,6 @@ severs
sewage
sewers
sewing
sexier
sexily
sexing
sexism
sexist
sexpot
sextet
sexton
sexual
shabby
shacks
shaded
shades
@ -6383,10 +6230,7 @@ shaggy
shaken
shaker
shakes
shalom
shaman
shamed
shames
shandy
shanks
shanty
@ -6432,7 +6276,6 @@ shirks
shirrs
shirts
shirty
shitty
shiver
shoals
shoats
@ -6575,9 +6418,6 @@ slangy
slants
slated
slates
slaved
slaver
slaves
slayed
slayer
sleaze
@ -6672,7 +6512,6 @@ snarks
snarky
snarls
snarly
snatch
snazzy
sneaks
sneaky
@ -6716,7 +6555,6 @@ socket
sodded
sodden
sodium
sodomy
soever
soften
softer
@ -7468,7 +7306,6 @@ torrid
torsos
tortes
tossed
tosser
tosses
tossup
totals
@ -7686,7 +7523,6 @@ unhook
unhurt
unions
unique
unisex
unison
united
unites
@ -7793,7 +7629,6 @@ vacant
vacate
vacuum
vagary
vagina
vaguer
vainer
vainly
@ -7930,9 +7765,6 @@ votive
vowels
vowing
voyage
voyeur
vulgar
vulvae
wabbit
wacker
wackos
@ -7975,7 +7807,6 @@ wander
wangle
waning
wanked
wanker
wanner
wanted
wanton

View File

@ -1944,7 +1944,6 @@ dosage
dose
dotted
doubling
douche
dove
down
dowry
@ -3015,7 +3014,6 @@ groom
groove
grooving
groovy
grope
ground
grouped
grout
@ -3135,7 +3133,6 @@ happiness
happy
harbor
hardcopy
hardcore
hardcover
harddisk
hardened
@ -6553,7 +6550,6 @@ swimmer
swimming
swimsuit
swimwear
swinger
swinging
swipe
swirl
@ -7464,9 +7460,7 @@ villain
vindicate
vineyard
vintage
violate
violation
violator
violet
violin
viper

File diff suppressed because it is too large

View File

@ -0,0 +1,29 @@
"""empty message
Revision ID: 818b0a956205
Revises: 4bc54632d9aa
Create Date: 2024-02-01 10:43:46.253184
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '818b0a956205'
down_revision = '4bc54632d9aa'
branch_labels = None
depends_on = None
def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('alias', sa.Column('last_email_log_id', sa.Integer(), nullable=True))
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_column('alias', 'last_email_log_id')
    # ### end Alembic commands ###

View File

@ -0,0 +1,48 @@
"""empty message
Revision ID: 52510a633d6f
Revises: 818b0a956205
Create Date: 2024-03-12 12:46:24.161644
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "52510a633d6f"
down_revision = "818b0a956205"
branch_labels = None
depends_on = None
def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column(
        "alias", sa.Column("flags", sa.BigInteger(), server_default="0", nullable=False)
    )
    with op.get_context().autocommit_block():
        op.create_index(op.f("ix_alias_flags"), "alias", ["flags"], unique=False)
        op.create_index(op.f("ix_job_state"), "job", ["state"], unique=False)
        op.create_index(
            "ix_state_run_at_taken_at",
            "job",
            ["state", "run_at", "taken_at"],
            unique=False,
        )
        op.create_index(
            op.f("ix_notification_user_id"), "notification", ["user_id"], unique=False
        )
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    with op.get_context().autocommit_block():
        op.drop_index(op.f("ix_notification_user_id"), table_name="notification")
        op.drop_index("ix_state_run_at_taken_at", table_name="job")
        op.drop_index(op.f("ix_job_state"), table_name="job")
        op.drop_index(op.f("ix_alias_flags"), table_name="alias")
    op.drop_column("alias", "flags")
    # ### end Alembic commands ###

View File

@ -0,0 +1,29 @@
"""empty message
Revision ID: fa2f19bb4e5a
Revises: 52510a633d6f
Create Date: 2024-04-09 13:12:26.305340
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'fa2f19bb4e5a'
down_revision = '52510a633d6f'
branch_labels = None
depends_on = None
def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('users', sa.Column('enable_data_breach_check', sa.Boolean(), server_default='0', nullable=False))
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_column('users', 'enable_data_breach_check')
    # ### end Alembic commands ###

View File

@ -0,0 +1,38 @@
"""Create sync_event table
Revision ID: 06a9a7133445
Revises: fa2f19bb4e5a
Create Date: 2024-05-17 13:11:20.402259
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '06a9a7133445'
down_revision = 'fa2f19bb4e5a'
branch_labels = None
depends_on = None
def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('sync_event',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('created_at', sqlalchemy_utils.types.arrow.ArrowType(), nullable=False),
        sa.Column('updated_at', sqlalchemy_utils.types.arrow.ArrowType(), nullable=True),
        sa.Column('content', sa.LargeBinary(), nullable=False),
        sa.Column('taken_time', sqlalchemy_utils.types.arrow.ArrowType(), nullable=True),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_sync_event_created_at'), 'sync_event', ['created_at'], unique=False)
    op.create_index(op.f('ix_sync_event_taken_time'), 'sync_event', ['taken_time'], unique=False)
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_table('sync_event')
    # ### end Alembic commands ###
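The nullable taken_time column added here is what separates pending events from ones a consumer has already picked up; the event-listener code itself is not part of this diff, so the following is only a minimal sketch of how a worker could claim the oldest pending row under that convention. The function name and the FOR UPDATE SKIP LOCKED approach are illustrative assumptions, not the project's actual implementation.

import arrow
from app.db import Session


def claim_next_sync_event():
    # Illustration only: stamp taken_time on the oldest unclaimed row, so the
    # "taken_time IS NULL == pending" convention used by the monitoring queries holds.
    row = Session.execute(
        """
        UPDATE sync_event
        SET taken_time = :now
        WHERE id = (
            SELECT id FROM sync_event
            WHERE taken_time IS NULL
            ORDER BY created_at ASC
            LIMIT 1
            FOR UPDATE SKIP LOCKED
        )
        RETURNING id, content
        """,
        {"now": arrow.now().datetime},
    ).fetchone()
    Session.commit()
    return row  # None when nothing is pending

FOR UPDATE SKIP LOCKED lets several workers poll the table without blocking one another; whether the real listener uses this exact pattern is not shown in this diff.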

View File

@ -4,6 +4,7 @@ import subprocess
from time import sleep
from typing import List, Dict
import arrow
import newrelic.agent
from app.db import Session
@ -93,11 +94,44 @@ def log_nb_db_connection():
    newrelic.agent.record_custom_metric("Custom/nb_db_connections", nb_connection)


@newrelic.agent.background_task()
def log_pending_to_process_events():
    r = Session.execute("select count(*) from sync_event WHERE taken_time IS NULL;")
    events_pending = list(r)[0][0]
    LOG.d("number of events pending to process %s", events_pending)
    newrelic.agent.record_custom_metric(
        "Custom/sync_events_pending_to_process", events_pending
    )


@newrelic.agent.background_task()
def log_events_pending_dead_letter():
    since = arrow.now().shift(minutes=-10).datetime
    r = Session.execute(
        """
        SELECT COUNT(*)
        FROM sync_event
        WHERE (taken_time IS NOT NULL AND taken_time < :since)
           OR (taken_time IS NULL AND created_at < :since)
        """,
        {"since": since},
    )
    events_pending = list(r)[0][0]
    LOG.d("number of events pending dead letter %s", events_pending)
    newrelic.agent.record_custom_metric(
        "Custom/sync_events_pending_dead_letter", events_pending
    )


if __name__ == "__main__":
    exporter = MetricExporter(get_newrelic_license())

    while True:
        log_postfix_metrics()
        log_nb_db_connection()
        log_pending_to_process_events()
        log_events_pending_dead_letter()
        Session.close()

        exporter.run()

View File

@ -0,0 +1,44 @@
#!/usr/bin/env python3
import argparse
import time
from sqlalchemy import func
from app.models import Alias
from app.db import Session
parser = argparse.ArgumentParser(
    prog="Backfill alias", description="Backfill alias last use"
)
parser.add_argument(
    "-s", "--start_alias_id", default=0, type=int, help="Initial alias_id"
)
parser.add_argument("-e", "--end_alias_id", default=0, type=int, help="Last alias_id")
args = parser.parse_args()
alias_id_start = args.start_alias_id
max_alias_id = args.end_alias_id
if max_alias_id == 0:
    max_alias_id = Session.query(func.max(Alias.id)).scalar()
print(f"Checking alias {alias_id_start} to {max_alias_id}")
step = 1000
el_query = "SELECT alias_id, MAX(id) from email_log where alias_id>=:start AND alias_id < :end GROUP BY alias_id"
alias_query = "UPDATE alias set last_email_log_id = :el_id where id = :alias_id"
updated = 0
start_time = time.time()
for batch_start in range(alias_id_start, max_alias_id, step):
    rows = Session.execute(el_query, {"start": batch_start, "end": batch_start + step})
    for row in rows:
        Session.execute(alias_query, {"alias_id": row[0], "el_id": row[1]})
        Session.commit()
        updated += 1
    elapsed = time.time() - start_time
    time_per_alias = elapsed / (updated + 1)
    last_batch_id = batch_start + step
    remaining = max_alias_id - last_batch_id
    time_remaining = (max_alias_id - last_batch_id) * time_per_alias
    hours_remaining = time_remaining / 3600.0
    print(
        f"\rAlias {batch_start}/{max_alias_id} {updated} {hours_remaining:.2f}hrs remaining"
    )
print("")

View File

@ -0,0 +1,37 @@
#!/usr/bin/env python3
import argparse
import random
import time
from sqlalchemy import func
from app import config
from app.models import Alias, Contact
from app.db import Session
parser = argparse.ArgumentParser(
    prog=f"Replace {config.NOREPLY}",
    description=f"Replace {config.NOREPLY} from contacts reply email",
)
args = parser.parse_args()
max_alias_id: int = Session.query(func.max(Alias.id)).scalar()
start = time.time()
tests = 1000
for i in range(tests):
    alias = (
        Alias.filter(Alias.id > int(random.random() * max_alias_id))
        .order_by(Alias.id.asc())
        .limit(1)
        .first()
    )
    contact = Contact.filter_by(alias_id=alias.id).order_by(Contact.id.asc()).first()
    mailboxes = alias.mailboxes
    user = alias.user
    if i % 10:
        print(f"{i} -> {alias.id}")
end = time.time()
time_taken = end - start
print(f"Took {time_taken} -> {time_taken/tests} per test")

View File

@ -0,0 +1,56 @@
#!/usr/bin/env python3
import argparse
import time
from sqlalchemy import func
from app.models import Alias, SLDomain
from app.db import Session
parser = argparse.ArgumentParser(
    prog="Mark partner created aliases with the PARTNER_CREATED flag",
)
parser.add_argument(
    "-s", "--start_alias_id", default=0, type=int, help="Initial alias_id"
)
parser.add_argument("-e", "--end_alias_id", default=0, type=int, help="Last alias_id")
args = parser.parse_args()
alias_id_start = args.start_alias_id
max_alias_id = args.end_alias_id
if max_alias_id == 0:
    max_alias_id = Session.query(func.max(Alias.id)).scalar()
print(f"Updating aliases from {alias_id_start} to {max_alias_id}")
domains = SLDomain.filter(SLDomain.partner_id.isnot(None)).all()
cond = [f"email like '%{domain.domain}'" for domain in domains]
sql_or_cond = " OR ".join(cond)
sql = f"UPDATE alias set flags = (flags | :flag) WHERE id >= :start and id<:end and flags & :flag = 0 and ({sql_or_cond})"
print(sql)
step = 1000
updated = 0
start_time = time.time()
for batch_start in range(alias_id_start, max_alias_id, step):
    updated += Session.execute(
        sql,
        {
            "start": batch_start,
            "end": batch_start + step,
            "flag": Alias.FLAG_PARTNER_CREATED,
        },
    ).rowcount
    elapsed = time.time() - start_time
    time_per_alias = elapsed / (batch_start - alias_id_start + step)
    last_batch_id = batch_start + step
    remaining = max_alias_id - last_batch_id
    time_remaining = (max_alias_id - last_batch_id) * time_per_alias
    hours_remaining = time_remaining / 3600.0
    percent = int(
        ((batch_start - alias_id_start) * 100) / (max_alias_id - alias_id_start)
    )
    print(
        f"\rAlias {batch_start}/{max_alias_id} {percent}% {updated} updated {hours_remaining:.2f}hrs remaining"
    )
print(f"Updated aliases up to {max_alias_id}")

View File

@ -0,0 +1,53 @@
#!/usr/bin/env python3
import argparse
import time
from app import config
from app.email_utils import generate_reply_email
from app.email_validation import is_valid_email
from app.models import Alias
from app.db import Session
parser = argparse.ArgumentParser(
    prog=f"Replace {config.NOREPLY}",
    description=f"Replace {config.NOREPLY} from contacts reply email",
)
args = parser.parse_args()
el_query = "SELECT id, alias_id, website_email from contact where id>=:last_id AND reply_email=:reply_email ORDER BY id ASC LIMIT :step"
update_query = "UPDATE contact SET reply_email=:reply_email WHERE id=:contact_id "
updated = 0
start_time = time.time()
step = 100
last_id = 0
print(f"Replacing contacts with reply_email={config.NOREPLY}")
while True:
    rows = Session.execute(
        el_query, {"last_id": last_id, "reply_email": config.NOREPLY, "step": step}
    )
    loop_updated = 0
    for row in rows:
        contact_id = row[0]
        alias_id = row[1]
        last_id = contact_id
        website_email = row[2]
        contact_email_for_reply = website_email if is_valid_email(website_email) else ""
        alias = Alias.get(alias_id)
        if alias is None:
            print(f"CANNOT find alias {alias_id} in database for contact {contact_id}")
            # skip this contact: generate_reply_email needs a valid alias
            continue
        reply_email = generate_reply_email(contact_email_for_reply, alias)
        print(
            f"Replacing contact {contact_id} with {website_email} reply_email for {reply_email}"
        )
        Session.execute(
            update_query, {"contact_id": row[0], "reply_email": reply_email}
        )
        Session.commit()
        updated += 1
        loop_updated += 1
    elapsed = time.time() - start_time
    print(f"\rContact {last_id} done")
    if loop_updated == 0:
        break
print("")

45
proto/event.proto Normal file
View File

@ -0,0 +1,45 @@
syntax = "proto3";
package simplelogin_events;
message UserPlanChange {
uint32 plan_end_time = 1;
}
message UserDeleted {
}
message AliasCreated {
uint32 alias_id = 1;
string alias_email = 2;
string alias_note = 3;
bool enabled = 4;
}
message AliasStatusChange {
uint32 alias_id = 1;
string alias_email = 2;
bool enabled = 3;
}
message AliasDeleted {
uint32 alias_id = 1;
string alias_email = 2;
}
message EventContent {
oneof content {
UserPlanChange user_plan_change = 1;
UserDeleted user_deleted = 2;
AliasCreated alias_created = 3;
AliasStatusChange alias_status_change = 4;
AliasDeleted alias_deleted = 5;
}
}
message Event {
uint32 user_id = 1;
string external_user_id = 2;
uint32 partner_id = 3;
EventContent content = 4;
}
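The generated bindings for this schema are not checked in (the generated directory is excluded from formatting and linting in the pyproject.toml change further down), so here is a hedged sketch of how code might build and serialize one of these events with the standard protobuf Python API. The module name event_pb2 follows protoc's default naming, and the import path assumes the output directory used by scripts/generate-proto-files.sh; neither appears verbatim in this diff.

from app.events.generated import event_pb2  # assumed location of the protoc output

event = event_pb2.Event(
    user_id=42,
    external_user_id="ext-42",
    partner_id=1,
    content=event_pb2.EventContent(
        alias_created=event_pb2.AliasCreated(
            alias_id=1000,
            alias_email="something@example.com",
            alias_note="",
            enabled=True,
        )
    ),
)

payload = event.SerializeToString()            # bytes, e.g. what sync_event.content could hold
decoded = event_pb2.Event.FromString(payload)
assert decoded.content.WhichOneof("content") == "alias_created"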

View File

@ -14,12 +14,14 @@ exclude = '''
| build
| dist
| migrations # migrations/ is generated by alembic
| app/events/generated
)/
)
'''
[tool.ruff]
ignore-init-module-imports = true
exclude = [".venv", "migrations", "app/events/generated"]
[tool.djlint]
indent = 2

24
scripts/generate-proto-files.sh Executable file
View File

@ -0,0 +1,24 @@
#!/bin/bash
set -euxo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" || exit 1; pwd -P)"
REPO_ROOT=$(echo "${SCRIPT_DIR}" | sed 's:scripts::g')
DEST_DIR="${REPO_ROOT}/app/events/generated"
PROTOC=${PROTOC:-"protoc"}
if ! eval "${PROTOC} --version" &> /dev/null ; then
    echo "Cannot find $PROTOC"
    exit 1
fi
rm -rf "${DEST_DIR}"
mkdir -p "${DEST_DIR}"
pushd $REPO_ROOT || exit 1
eval "${PROTOC} --proto_path=proto --python_out=\"${DEST_DIR}\" --pyi_out=\"${DEST_DIR}\" proto/event.proto"
popd || exit 1

View File

@ -228,6 +228,8 @@ def load_user(alternative_id):
    sentry_sdk.set_user({"email": user.email, "id": user.id})
    if user.disabled:
        return None
    if not user.is_active():
        return None
    return user

0
tasks/__init__.py Normal file
View File

View File

@ -0,0 +1,19 @@
import arrow
from app import s3
from app.log import LOG
from app.models import BatchImport
def cleanup_old_imports(oldest_allowed: arrow.Arrow):
    LOG.i(f"Deleting imports older than {oldest_allowed}")
    for batch_import in (
        BatchImport.filter(BatchImport.created_at < oldest_allowed).yield_per(500).all()
    ):
        LOG.i(
            f"Deleting batch import {batch_import} with file {batch_import.file.path}"
        )
        file = batch_import.file
        if file is not None:
            s3.delete(file.path)
        BatchImport.delete(batch_import.id, commit=True)

24
tasks/cleanup_old_jobs.py Normal file
View File

@ -0,0 +1,24 @@
import arrow
from sqlalchemy import or_, and_
from app import config
from app.db import Session
from app.log import LOG
from app.models import Job, JobState
def cleanup_old_jobs(oldest_allowed: arrow.Arrow):
    LOG.i(f"Deleting jobs older than {oldest_allowed}")
    count = Job.filter(
        or_(
            Job.state == JobState.done.value,
            Job.state == JobState.error.value,
            and_(
                Job.state == JobState.taken.value,
                Job.attempts >= config.JOB_MAX_ATTEMPTS,
            ),
        ),
        Job.updated_at < oldest_allowed,
    ).delete()
    Session.commit()
    LOG.i(f"Deleted {count} jobs")

View File

@ -0,0 +1,12 @@
import arrow
from app.db import Session
from app.log import LOG
from app.models import Notification
def cleanup_old_notifications(oldest_allowed: arrow.Arrow):
    LOG.i(f"Deleting notifications older than {oldest_allowed}")
    count = Notification.filter(Notification.created_at < oldest_allowed).delete()
    Session.commit()
    LOG.i(f"Deleted {count} notifications")

View File

@ -38,11 +38,21 @@
<span>or</span>
</div>
<a class="btn btn-primary btn-block mt-2 proton-button"
href="{{ url_for("auth.proton_login", next=next_url) }}">
href="{{ url_for('auth.proton_login', next=next_url) }}">
<img class="mr-2" src="/static/images/proton.svg" />
Log in with Proton
</a>
{% endif %}
{% if connect_with_oidc %}
<div class="text-center my-2 text-gray">
<span>or</span>
</div>
<a class="btn btn-primary btn-block mt-2 btn-social"
href="{{ url_for('auth.oidc_login', next=next_url) }}">
<i class="fa {{ connect_with_oidc_icon }}"></i> Log in with SSO
</a>
{% endif %}
</div>
</div>
<div class="text-center text-muted mt-2">

View File

@ -15,7 +15,7 @@
{{ otp_token_form.csrf_token }}
<input type="hidden" name="form-name" value="create" />
<div class="font-weight-bold mt-5">Token</div>
<div class="small-text mb-3">Please enter the 2FA code from your 2FA authenticator</div>
<div class="small-text mb-3">Please enter the 2FA code from your authenticator app</div>
{{ otp_token_form.token(class="form-control", autofocus="true") }}
{{ render_field_errors(otp_token_form.token) }}
<div class="form-check">

View File

@ -9,7 +9,7 @@
<h1 class="card-title">Create new account</h1>
<div class="form-group">
<label class="form-label">Email address</label>
{{ form.email(class="form-control", type="email", placeholder="YourName@protonmail.com") }}
{{ form.email(class="form-control", type="email", placeholder="username@proton.me") }}
<div class="small-text alert alert-info" style="margin-top: 1px">
Emails sent to your alias will be forwarded to this email address.
<br>
@ -50,11 +50,21 @@
<span>or</span>
</div>
<a class="btn btn-primary btn-block mt-2 proton-button"
href="{{ url_for("auth.proton_login", next=next_url) }}">
href="{{ url_for('auth.proton_login', next=next_url) }}">
<img class="mr-2" src="/static/images/proton.svg" />
Sign up with Proton
</a>
{% endif %}
{% if connect_with_oidc %}
<div class="text-center my-2 text-gray">
<span>or</span>
</div>
<a class="btn btn-primary btn-block mt-2 btn-social"
href="{{ url_for('auth.oidc_login', next=next_url) }}">
<i class="fa {{ connect_with_oidc_icon }}"></i> Sign up with SSO
</a>
{% endif %}
</div>
</form>
<div class="text-center text-muted mb-6">

View File

@ -7,6 +7,7 @@
<div class="card-body p-6 text-center">
<h1 class="h4">An email to validate your email is on its way.</h1>
<p>Please check your inbox/spam folder.</p>
<p>Make sure to mark the message as not spam so that future messages arrive in your inbox.</p>
</div>
</div>
{% endblock %}

View File

@ -0,0 +1,164 @@
{% extends "default.html" %}
{% set active_page = "setting" %}
{% block title %}Settings{% endblock %}
{% block head %}
<style>
.card-title {
font-size: 22px;
font-weight: 600;
margin-bottom: 3px;
}
.highlighted{
border: solid 2px #5675E2;
}
li {
margin-top: 8px;
}
</style>
{% endblock %}
{% block default_content %}
<div class="col pb-3">
<!-- Change email -->
<div class="card">
<div class="card-body">
<form method="post" enctype="multipart/form-data">
<input type="hidden" name="form-name" value="update-email">
{{ change_email_form.csrf_token }}
<div class="card-title">Account Email</div>
<div class="mb-3">
This email address is used to log in to SimpleLogin.
<br />
If you want to change the mailbox that emails are forwarded to, use the
<a href="{{ url_for('dashboard.mailbox_route') }}">
<i class="fe fe-inbox"></i> Mailboxes page
</a>
instead.
</div>
<div class="form-group mt-2">
<!-- Not allow user to change email if there's a pending change -->
{{ change_email_form.email(class="form-control", value=current_user.email, readonly=pending_email != None) }}
{{ render_field_errors(change_email_form.email) }}
</div>
<button class="btn btn-outline-primary">Change Email</button>
</form>
{% if pending_email %}
<div class="mt-2">
<span class="text-danger float-left">Pending email change: {{ pending_email }}</span>
<form method="POST"
action="{{ url_for('dashboard.resend_email_change') }}"
class="float-left ml-2">
{{ change_email_form.csrf_token }}
<a onclick="this.closest('form').submit()"
class="btn btn-secondary btn-sm">Resend confirmation email</a>
</form>
<form method="POST"
action="{{ url_for('dashboard.cancel_email_change') }}"
class="float-left ml-2">
{{ change_email_form.csrf_token }}
<a onclick="this.closest('form').submit()"
class="btn btn-secondary btn-sm">Cancel email change</a>
</form>
</div>
{% endif %}
</div>
</div>
<!-- END Change email -->
<!-- Change password -->
<div class="card" id="change_password">
<div class="card-body">
<div class="card-title">Password</div>
<div class="mb-3">You will receive an email containing instructions on how to change your password.</div>
<form method="post">
{{ csrf_form.csrf_token }}
<input type="hidden" name="form-name" value="change-password">
<button class="btn btn-outline-primary">Change password</button>
</form>
</div>
</div>
<!-- END Change password -->
<!-- TOTP -->
<div class="card" id="totp">
<div class="card-body">
<div class="card-title">Two Factor Authentication</div>
<div class="mb-3">
Secure your account with 2FA, you'll be asked for a code generated through an app when you login.
<br />
</div>
{% if not current_user.enable_otp %}
<a href="{{ url_for('dashboard.mfa_setup') }}"
class="btn btn-outline-primary">Setup TOTP</a>
{% else %}
<a href="{{ url_for('dashboard.mfa_cancel') }}"
class="btn btn-outline-danger">Disable TOTP</a>
{% endif %}
</div>
</div>
<!-- END TOTP -->
<!-- WebAuthn -->
<div class="card">
<div class="card-body">
<div class="card-title">Security Key (WebAuthn)</div>
<div class="mb-3">
You can secure your account by linking either your FIDO-supported physical key such as Yubikey, Google
Titan,
or a device with appropriate hardware to your account.
</div>
{% if current_user.fido_uuid is none %}
<a href="{{ url_for('dashboard.fido_setup') }}"
class="btn btn-outline-primary">Setup WebAuthn</a>
{% else %}
<a href="{{ url_for('dashboard.fido_manage') }}"
class="btn btn-outline-info">Manage WebAuthn</a>
{% endif %}
</div>
</div>
<!-- END WebAuthn -->
<!-- data export -->
<div class="card">
<div class="card-body">
<div class="card-title">SimpleLogin data export</div>
<div class="mb-3">
As per GDPR (General Data Protection Regulation) law, you can request a copy of your data which are stored on
SimpleLogin.
A zip file that contains all information will be sent to your SimpleLogin account address.
</div>
<div class="d-flex">
<div>
<form method="post">
{{ csrf_form.csrf_token }}
<input type="hidden" name="form-name" value="send-full-user-report">
<button class="btn btn-outline-info">Request your data</button>
</form>
</div>
</div>
</div>
</div>
<!-- END data export -->
<!-- Delete account -->
<div class="card">
<div class="card-body">
<div class="card-title">Account Deletion</div>
<div class="mb-3">If SimpleLogin isn't the right fit for you, you can simply delete your account.</div>
<a href="{{ url_for('dashboard.delete_account') }}"
class="btn btn-outline-danger">Delete account</a>
</div>
</div>
<!-- END Delete account -->
</div>
{% endblock %}
{% block script %}
<script>
let anchor = window.location.hash;
$(anchor).addClass("highlighted")
</script>
{% endblock %}

View File

@ -59,26 +59,29 @@
</div>
</div>
</div>
<div class="row mb-5">
<div class="col-12 col-lg-6 pt-1">
<form method="post">
<input type="hidden" name="form-name" value="create" />
{{ new_contact_form.csrf_token }}
{{ new_contact_form.email(class="form-control", placeholder="First Last <email@example.com>", autofocus=True) }}
{{ render_field_errors(new_contact_form.email) }}
<div class="small-text">Where do you want to send the email?</div>
{% if can_create_contacts %}
{% if can_create_contacts %}
<button class="btn btn-primary mt-2">Create reverse-alias</button>
{% else %}
<button disabled
title="Upgrade to premium to create reverse-aliases"
class="btn btn-primary mt-2">
Create reverse-alias
</button>
{% endif %}
</form>
</div>
<div class="row mb-5">
<div class="col-12 col-lg-6 pt-1">
<form method="post">
<input type="hidden" name="form-name" value="create" />
{{ new_contact_form.csrf_token }}
{{ new_contact_form.email(class="form-control", placeholder="First Last <email@example.com>", autofocus=True) }}
{{ render_field_errors(new_contact_form.email) }}
<div class="small-text">Where do you want to send the email?</div>
{% if can_create_contacts %}
<button class="btn btn-primary mt-2">Create reverse-alias</button>
{% else %}
<button disabled
title="Upgrade to premium to create reverse-aliases"
class="btn btn-primary mt-2">
Create reverse-alias
</button>
{% endif %}
</form>
</div>
{% endif %}
<div class="col-12 col-lg-6 pt-1">
<div class="float-right d-flex">
<form method="post">

View File

@ -17,7 +17,7 @@
<b>hello@{{ FIRST_ALIAS_DOMAIN }}</b>,
<b>me@{{ FIRST_ALIAS_DOMAIN }}</b>, etc.
<br />
If you add your own domain, this restriction is removed, and you can fully customize the alias.
If you add your own domain (or subdomain), this restriction is removed, and you can fully customize the alias.
<br />
</div>
</div>

View File

@ -22,11 +22,20 @@
<p>Alternatively you can use your Proton credentials to ensure it's you.</p>
</div>
<a class="btn btn-primary btn-block mt-2 proton-button w-25"
href="{{ url_for("auth.proton_login", next=next) }}">
href="{{ url_for('auth.proton_login', next=next) }}">
<img class="mr-2" src="/static/images/proton.svg" />
Authenticate with Proton
</a>
{% endif %}
{% if connect_with_oidc %}
<div class="my-3">
<p>Alternatively you can use your SSO credentials to ensure it's you.</p>
<a class="btn btn-primary btn-block mt-2 btn-social w-25"
href="{{ url_for('auth.oidc_login', next=next) }}">
<i class="fa {{ connect_with_oidc_icon }}"></i> Authenticate with SSO
</a>
{% endif %}
</div>
</div>
</div>
{% endblock %}
{% endblock %}

View File

@ -12,7 +12,7 @@
<div class="card-body">
<h1 class="h3">Two Factor Authentication - TOTP</h1>
<p>
You will need to use a 2FA application like Google Authenticator or Authy on your phone or PC and scan the following QR Code:
You will need to use a 2FA application like Proton Pass or Aegis on your phone or PC and scan the following QR Code:
</p>
<canvas id="qr"></canvas>
<script>

View File

@ -10,7 +10,7 @@
<div>{{ notification.message | safe }}</div>
<form method="post"
class="float-right mt-3"
onsubmit="return confirm('This operation is not reversible, please confirm');">
onsubmit="return confirm('This operation is irreversible, please confirm');">
<button class="btn btn-outline-danger">Delete</button>
</form>
</div>

View File

@ -88,45 +88,6 @@
</div>
</div>
<!-- END Current plan -->
<!-- TOTP -->
<div class="card" id="totp">
<div class="card-body">
<div class="card-title">Two Factor Authentication</div>
<div class="mb-3">
Secure your account with 2FA, you'll be asked for a code generated through an app when you login.
<br />
</div>
{% if not current_user.enable_otp %}
<a href="{{ url_for('dashboard.mfa_setup') }}"
class="btn btn-outline-primary">Setup TOTP</a>
{% else %}
<a href="{{ url_for('dashboard.mfa_cancel') }}"
class="btn btn-outline-danger">Disable TOTP</a>
{% endif %}
</div>
</div>
<!-- END TOTP -->
<!-- WebAuthn -->
<div class="card">
<div class="card-body">
<div class="card-title">Security Key (WebAuthn)</div>
<div class="mb-3">
You can secure your account by linking either your FIDO-supported physical key such as Yubikey, Google
Titan,
or a device with appropriate hardware to your account.
</div>
{% if current_user.fido_uuid is none %}
<a href="{{ url_for('dashboard.fido_setup') }}"
class="btn btn-outline-primary">Setup WebAuthn</a>
{% else %}
<a href="{{ url_for('dashboard.fido_manage') }}"
class="btn btn-outline-info">Manage WebAuthn</a>
{% endif %}
</div>
</div>
<!-- END WebAuthn -->
<!-- Newsletter -->
<div class="card" id="notification">
<div class="card-body">
@ -179,52 +140,6 @@
</form>
</div>
<!-- END change name & profile picture -->
<!-- Change email -->
<div class="card">
<div class="card-body">
<form method="post" enctype="multipart/form-data">
<input type="hidden" name="form-name" value="update-email">
{{ change_email_form.csrf_token }}
<div class="card-title">Account Email</div>
<div class="mb-3">
This email address is used to log in to SimpleLogin.
<br />
If you want to change the mailbox that emails are forwarded to, use the
<a href="{{ url_for('dashboard.mailbox_route') }}">
<i class="fe fe-inbox"></i> Mailboxes page
</a>
instead.
</div>
<div class="form-group mt-2">
<!-- Not allow user to change email if there's a pending change -->
{{ change_email_form.email(class="form-control", value=current_user.email, readonly=pending_email != None) }}
{{ render_field_errors(change_email_form.email) }}
</div>
<button class="btn btn-outline-primary">Change Email</button>
</form>
{% if pending_email %}
<div class="mt-2">
<span class="text-danger float-left">Pending email change: {{ pending_email }}</span>
<form method="POST"
action="{{ url_for('dashboard.resend_email_change') }}"
class="float-left ml-2">
{{ change_email_form.csrf_token }}
<a onclick="this.closest('form').submit()"
class="btn btn-secondary btn-sm">Resend confirmation email</a>
</form>
<form method="POST"
action="{{ url_for('dashboard.cancel_email_change') }}"
class="float-left ml-2">
{{ change_email_form.csrf_token }}
<a onclick="this.closest('form').submit()"
class="btn btn-secondary btn-sm">Cancel email change</a>
</form>
</div>
{% endif %}
</div>
</div>
<!-- END Change email -->
<!-- Connect with Proton -->
{% if connect_with_proton %}
@ -265,32 +180,11 @@
</div>
{% endif %}
<!-- END Connect with Proton -->
<!-- Change password -->
<div class="card" id="change_password">
<div class="card-body">
<div class="card-title">Password</div>
<div class="mb-3">
You will receive an email containing instructions on how to change your password.
</div>
<form method="post">
{{ csrf_form.csrf_token }}
<input type="hidden" name="form-name" value="change-password">
<button class="btn btn-outline-primary">
Change password
</button>
</form>
</div>
</div>
<!-- END Change password -->
<!-- Random alias -->
<div id="random-alias" class="card">
<div class="card-body">
<div class="card-title">
Aliases
</div>
<div class="mt-3 mb-1">
Change the way random aliases are generated by default.
</div>
<div class="card-title">Aliases</div>
<div class="mt-3 mb-1">Change the way random aliases are generated by default.</div>
<form method="post" action="#random-alias" class="form-inline">
{{ csrf_form.csrf_token }}
<input type="hidden" name="form-name" value="change-alias-generator">
@ -306,13 +200,9 @@
on {{ AliasGeneratorEnum.uuid.name.upper() }}
</option>
</select>
<button class="btn btn-outline-primary">
Update
</button>
<button class="btn btn-outline-primary">Update</button>
</form>
<div class="mt-3 mb-1">
Select the default domain for aliases.
</div>
<div class="mt-3 mb-1">Select the default domain for aliases.</div>
<form method="post" action="#random-alias" class="form-inline">
{{ csrf_form.csrf_token }}
<input type="hidden"
@ -338,13 +228,9 @@
</option>
{% endfor %}
</select>
<button class="btn btn-outline-primary">
Update
</button>
<button class="btn btn-outline-primary">Update</button>
</form>
<div id="random-alias-suffix" class="mt-3 mb-1">
Select the default suffix generator for aliases.
</div>
<div id="random-alias-suffix" class="mt-3 mb-1">Select the default suffix generator for aliases.</div>
<form method="post" action="#random-alias-suffix" class="form-inline">
{{ csrf_form.csrf_token }}
<input type="hidden" name="form-name" value="random-alias-suffix">
@ -358,19 +244,51 @@
Random combination of {{ ALIAS_RAND_SUFFIX_LENGTH }} letter and digits
</option>
</select>
<button class="btn btn-outline-primary">
Update
</button>
<button class="btn btn-outline-primary">Update</button>
</form>
</div>
</div>
<!-- END Random alias -->
<!-- Data breach check -->
<div class="card" id="data-breach">
<div class="card-body">
<div class="card-title">Data breach monitoring</div>
<div class="mt-1 mb-3">
{% if not current_user.is_premium() %}
<div class="alert alert-info" role="alert">
This feature is only available on Premium plan.
<a href="{{ url_for('dashboard.pricing') }}"
target="_blank"
rel="noopener noreferrer">
Upgrade<i class="fe fe-external-link"></i>
</a>
</div>
{% endif %}
If enabled, we will inform you via email if one of your aliases appears in a data breach.
<br>
SimpleLogin uses the <a href="https://haveibeenpwned.com/">HaveIBeenPwned</a> API to check for data breaches.
</div>
<form method="post" action="#data-breach">
{{ csrf_form.csrf_token }}
<input type="hidden" name="form-name" value="enable_data_breach_check">
<div class="form-check">
<input type="checkbox"
id="enable_data_breach_check"
name="enable_data_breach_check"
{% if current_user.enable_data_breach_check %} checked{% endif %}
class="form-check-input">
<label for="enable_data_breach_check">Enable data breach monitoring</label>
</div>
<button type="submit" class="btn btn-outline-primary">Update</button>
</form>
</div>
</div>
<!-- END Data breach check -->
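For reference, a minimal sketch of the POST handler this form implies. Only the form-name value, the checkbox name, the is_premium() gate and the enable_data_breach_check column come from this diff; the function name, the flash messages and the dashboard.setting endpoint are assumptions.

from flask import flash, redirect, request, url_for
from flask_login import current_user

from app.db import Session


def handle_setting_post():
    if request.form.get("form-name") == "enable_data_breach_check":
        if not current_user.is_premium():
            flash("Data breach monitoring is only available on the Premium plan", "warning")
            return redirect(url_for("dashboard.setting"))
        # an unchecked checkbox is simply absent from the submitted form data
        current_user.enable_data_breach_check = (
            request.form.get("enable_data_breach_check") == "on"
        )
        Session.commit()
        flash("Your data breach monitoring setting has been updated", "success")
    return redirect(url_for("dashboard.setting"))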
<!-- Sender Format -->
<div class="card" id="sender-format">
<div class="card-body">
<div class="card-title">
Sender Address Format
</div>
<div class="card-title">Sender Address Format</div>
<div class="mt-1 mb-3">
When your alias receives an email, say from: <b>John Wick &lt;john@wick.com&gt;</b>,
SimpleLogin forwards it to your mailbox.
@ -685,7 +603,7 @@
sender address.
<br />
If this option is enabled, the original sender addresses is stored in the email header <b>X-SimpleLogin-Envelope-From</b>
and the original From header is stored in <b>X-SimpleLogin-Original-From<b>.
and the original From header is stored in <b>X-SimpleLogin-Original-From</b>.
You can choose to display this header in your email client.
<br />
As email headers aren't encrypted, your mailbox service can know the sender address via this header.
@ -709,6 +627,7 @@
</form>
</div>
</div>
<!-- Alias import/export -->
<div class="card">
<div class="card-body">
<div class="card-title">
@ -719,52 +638,12 @@
You can also export your aliases to a readable csv format for a future batch import.
</div>
<a href="{{ url_for('dashboard.batch_import_route') }}"
class="btn btn-outline-primary">
Batch Import
</a>
class="btn btn-outline-primary">Batch Import</a>
<a href="{{ url_for('dashboard.alias_export_route') }}"
class="btn btn-outline-secondary">
Export Aliases
</a>
</div>
</div>
<div class="card">
<div class="card-body">
<div class="card-title">
SimpleLogin data export
</div>
<div class="mb-3">
As per GDPR (General Data Protection Regulation) law, you can request a copy of your data which are stored on
SimpleLogin.
A zip file that contains all information will be sent to your SimpleLogin account address.
</div>
<div class="d-flex">
<div>
<form method="post">
{{ csrf_form.csrf_token }}
<input type="hidden" name="form-name" value="send-full-user-report">
<button class="btn btn-outline-info">
Request your data
</button>
</form>
</div>
</div>
</div>
</div>
<div class="card">
<div class="card-body">
<div class="card-title">
Account Deletion
</div>
<div class="mb-3">
If SimpleLogin isn't the right fit for you, you can simply delete your account.
</div>
<a href="{{ url_for('dashboard.delete_account') }}"
class="btn btn-outline-danger">
Delete account
</a>
class="btn btn-outline-secondary">Export Aliases</a>
</div>
</div>
<!-- END Alias import/export -->
</div>
{% endblock %}
{% block script %}

View File

@ -18,7 +18,7 @@
<br />
For generic questions, i.e. not related to your account, we recommend to post the question on
our
<a href="https://www.reddit.com/r/Simplelogin/">Reddit</a>
<a href="https://www.reddit.com/r/Simplelogin/">Reddit</a> or <a href="https://forum.simplelogin.io/">our official forum</a>
where our community can help answer the question
and other people with the same question can find the answer there.
</div>

View File

@ -1,17 +1,19 @@
{% extends "default.html" %}
{% set active_page = "dashboard" %}
{% block title %}Block an alias{% endblock %}
{% block title %}Deactivate an alias{% endblock %}
{% block default_content %}
<div class="card">
<div class="card-body">
<h1 class="h3">Block alias</h1>
<h1 class="h3">Deactivate alias</h1>
<p>
You are about to block the alias
You are about to deactivate the alias
<a href="mailto:{{ alias }}" target="_blank">{{ alias }}</a>
</p>
<p>After this, you will stop receiving all emails sent to this alias, please confirm.</p>
<p>
After this, you will stop receiving all emails sent to this alias, please confirm. You will always be able to re-activate it until you decide to delete it.
</p>
<form method="post">
<button class="btn btn-warning">Confirm</button>
</form>

View File

@ -28,7 +28,7 @@
{{ render_text("Hi") }}
{{ render_text("If you use Safari on a MacBook or iMac, you should check out our new Safari extension.") }}
{{ render_text('It can be installed on
<a href="https://apps.apple.com/app/id1494051017">App Store</a>
<a href="https://apps.apple.com/app/id6475835429">App Store</a>
. Its code is available on
<a href="https://github.com/simple-login/mac-app">GitHub</a>
.') }}

View File

@ -8,7 +8,7 @@ If you use Safari on a MacBook or iMac, you should check out our new Safari exte
It can be installed on:
https://apps.apple.com/app/id1494051017
https://apps.apple.com/app/id6475835429
As usual, let me know if you have any question by replying to this email.

View File

@ -12,7 +12,7 @@ If you want to quickly create aliases <b>without</b> going to SimpleLogin websit
(or other Chromium-based browsers like Brave or Vivaldi),
<a href="https://addons.mozilla.org/firefox/addon/simplelogin/">Firefox</a>
and
<a href="https://apps.apple.com/app/id1494051017 ">Safari</a>
<a href="https://apps.apple.com/app/id6475835429 ">Safari</a>
extension.
{% endcall %}

View File

@ -11,7 +11,7 @@ Chrome: https://chrome.google.com/webstore/detail/dphilobhebphkdjbpfohgikllaljmg
Firefox: https://addons.mozilla.org/firefox/addon/simplelogin/
Safari: https://apps.apple.com/app/id1494051017
Safari: https://apps.apple.com/app/id6475835429
You can also manage your aliases using SimpleLogin mobile apps, available at
- Play Store https://play.google.com/store/apps/details?id=io.simplelogin.android

View File

@ -43,9 +43,8 @@ Note, if you are a paying Proton Mail user, you automatically receive the premiu
{% endcall %}
{% call text() %}
For any question, feedback or feature request, please join our
<a href="https://github.com/simple-login/app/discussions">GitHub forum</a>
.
For any question or feedback, please join our <a href="https://forum.simplelogin.io/">official forum</a>.
If you want to request a feature, please submit it on our <a href="https://github.com/simple-login/app/discussions">GitHub repo</a>.
You can also join our
<a href="https://www.reddit.com/r/Simplelogin/">Reddit</a>
or follow our

View File

@ -13,7 +13,8 @@ SimpleLogin is also available on Android and iOS so you can manage your aliases
Note, if you are a paying Proton Mail user, you automatically receive the premium version of SimpleLogin.
For any question, feedback or feature request, please join our GitHub forum.
For any question or feedback, please join our official forum.
If you want to request a feature, please submit it on our GitHub repo.
You can also join our Reddit or follow our Twitter.
Best,
@ -26,7 +27,8 @@ Firefox: https://addons.mozilla.org/firefox/addon/simplelogin/
Edge: https://microsoftedge.microsoft.com/addons/detail/simpleloginreceive-sen/diacfpipniklenphgljfkmhinphjlfff
Android: https://play.google.com/store/apps/details?id=io.simplelogin.android
iOS: https://apps.apple.com/app/id1494359858
Github forum: https://github.com/simple-login/app/discussions
Github repo: https://github.com/simple-login/app/discussions
Official forum: https://forum.simplelogin.io/
Reddit: https://www.reddit.com/r/Simplelogin/
Twitter: https://twitter.com/simple_login

View File

@ -71,9 +71,10 @@ Please note that you can't create more than {{ MAX_NB_EMAIL_FREE_PLAN }} aliases
{% endif %}
{% call text() %}
For any question, feedback or feature request, please join our
<a href="https://github.com/simple-login/app/discussions">GitHub forum</a>
.
For any question or feedback,
please join our <a href="https://forum.simplelogin.io/">official forum</a>.
If you want to request a feature,
please submit it on our <a href="https://github.com/simple-login/app/discussions">GitHub repo</a>.
You can also join our
<a href="https://www.reddit.com/r/Simplelogin/">Reddit</a>
or follow our

View File

@ -26,6 +26,8 @@ No worries: all aliases you create during this period will continue to work norm
At any time, you can reach out to us by simply replying to this email.
For any question, feedback or feature request, please join our GitHub forum at https://github.com/simple-login/app/discussions
For any question or feedback, please join our official forum at https://forum.simplelogin.io/
If you want to request a feature, please submit it on our GitHub repo at https://github.com/simple-login/app/discussions
You can also join our Reddit at https://www.reddit.com/r/Simplelogin/ follow our Twitter at https://twitter.com/simplelogin

View File

@ -4,6 +4,7 @@
{{ render_text("Thank you for choosing SimpleLogin.") }}
{{ render_text("To get started, please confirm that <b>" + email + "</b> is your email address by clicking on the button below within 1 hour.") }}
{{ render_text("If it wasn't you, maybe someone entered your email by mistake. In this case you can ignore this mail.") }}
{{ render_button("Verify email", activation_link) }}
{{ render_text('Thanks,
<br />

View File

@ -4,4 +4,6 @@
Thank you for choosing SimpleLogin.
To get started, please confirm that {{email}} is your email address using this link {{activation_link}} within 1 hour.
If it wasn't you, maybe someone entered your email by mistake. In this case, you can ignore this email.
{% endblock %}

Some files were not shown because too many files have changed in this diff