
Added beautiful icon. #1

Open · wants to merge 12 commits into base: master
2 changes: 1 addition & 1 deletion LICENSE.md
@@ -1,6 +1,6 @@
MIT License

Copyright (c) 2017 Marc Lijour
Copyright (c) 2017 bolstycjw

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
20 changes: 10 additions & 10 deletions README.md
@@ -1,34 +1,34 @@
# odoo-s3
# Odoo S3 Storage

## Dependencies
`odoo-s3` uses [`boto3`](https://github.com/boto/boto3) to talk to Amazon S3. You will need to install it on the host running Odoo.
`Odoo-S3-Storage` uses [`boto3`](https://github.com/boto/boto3) to talk to DigitalOcean Spaces through its S3-compatible API. You will need to install it on the host running Odoo.

## Installation
Make sure you set the `ODOO_ADDONS_PATH` variable to the directory where you install your custom Odoo modules.

```
pip install boto3
cd $ODOO_ADDONS_PATH
git clone https://github.com/marclijour/odoo-s3
git clone https://github.com/HP-bkeys/odoo-s3-storage.git
```

## Compatibility
This module is compatible with **Odoo 11** and **Python 3**. For older versions, you can refer to the original source code (see credits below).

## Configuration
In order to use `odoo-s3` you will need to switch to "Developer mode" and define a new system parameter as follows:
In order to use `Odoo-S3-Storage` you will need to switch to "Developer mode" and define a new system parameter as follows:

* without encryption:
```
ir_attachment.location ---> s3://<Your-AWS-Access-Key-ID>:<Your-AWS-Secret-Key>@<Your-S3-Bucket-name>
ir_attachment.location ---> s3://<Your-AWS-Access-Key-ID>:<Your-AWS-Secret-Key>@<Your-S3-Bucket-name>&<Your-DigitalOcean-base-url>

```
* with server-side encryption (only AES256, since [aws:kms is not supported in boto3](https://github.com/boto/botocore/issues/471)):
```
ir_attachment.location ---> s3://<Your-AWS-Access-Key-ID>:<Your-AWS-Secret-Key>@<Your-S3-Bucket-name>+SSE
ir_attachment.location ---> s3://<Your-AWS-Access-Key-ID>:<Your-AWS-Secret-Key>@<Your-S3-Bucket-name>&<Your-DigitalOcean-base-url>+SSE

```
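For reference, the `ir_attachment.location` string above decomposes as `s3://<access-key>:<secret-key>@<bucket>&<endpoint>[+SSE]`. The sketch below is a minimal illustration of that decomposition, not the module's actual parser (`models/s3_helper.py` does the real work); the key, bucket, and endpoint values are placeholders:

```python
def parse_location(url):
    """Split an ir_attachment.location value into its components."""
    assert url.startswith('s3://'), "expecting an s3:// scheme"
    remain = url[len('s3://'):]
    # NOTE: a simplistic split; keys containing ':' or '@' would break it.
    access_key_id, _, remain = remain.partition(':')
    secret_key, _, remain = remain.partition('@')
    bucket_name, _, remain = remain.partition('&')
    endpoint, _, sse = remain.partition('+')
    return (access_key_id, secret_key, bucket_name, endpoint, sse == 'SSE')

print(parse_location(
    's3://KEY:SECRET@my-bucket&nyc3.digitaloceanspaces.com+SSE'))
# → ('KEY', 'SECRET', 'my-bucket', 'nyc3.digitaloceanspaces.com', True)
```

Without the trailing `+SSE`, the last element comes back `False` and uploads are stored unencrypted.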
## Additional Information
This module is based on [`Odoo-S3`](https://github.com/tvanesse/odoo-s3) and [`Odoo-S3-Storage`](https://github.com/bolstycjw/odoo-s3-storage). The code was rewritten to work with **Odoo v10.0**, to use boto3 instead of boto, and to support DigitalOcean Spaces.

## Additional Information and Credits
This code is [forked from brolycjw's repository](https://github.com/brolycjw/odoo-s3-storage), which ported the [original code from tvanesse](https://github.com/tvanesse/odoo-s3) to Odoo v10.0 and moved from boto to boto3.

```
25 changes: 10 additions & 15 deletions __manifest__.py
@@ -1,21 +1,21 @@
# -*- coding: utf-8 -*-
{
'name': "odoo-s3",
'name': "S3 Storages",

'summary': """
Stores attachments in Amazon S3 instead of the local drive""",
Allows you to use a DigitalOcean Spaces bucket for file storage""",

'description': """
In large deployments, Odoo workers need to share a distributed
filestore. Amazon S3 can store files (e.g. attachments and
pictures), such that all Odoo workers can access the same files.

This module lets you configure access to an S3 bucket from Odoo,
by settings a System parameter.
Binary files such as attachments and pictures are stored by default
in the file system of the host running Odoo. In some cases you may
want to decrease the overall response time by delegating static file
storage to a specialized service such as a DigitalOcean Spaces bucket.
This module allows you to configure Odoo so that a DigitalOcean Spaces
bucket is used instead of the file system for binary file storage.
""",

'author': "Marc Lijour",
'website': "https://github.com/marclijour/odoo-s3",
'author': "brolycjw, hp-bkeys",
'website': "http://primetechnologies.com.sg/, https://homeprotech.com/",

# Categories can be used to filter modules in modules listing
# Check https://github.com/odoo/odoo/blob/master/odoo/addons/base/module/module_data.xml
@@ -25,9 +25,4 @@

# any module necessary for this one to work correctly
'depends': ['base'],

# only the admin user should be having access -so default is ok
# 'data': [
# 'security/ir.model.access.csv',
# ],
}
15 changes: 10 additions & 5 deletions models/models.py
@@ -33,16 +33,19 @@ def _connect_to_S3_bucket(self, s3, bucket_name):
def _file_read(self, fname, bin_size=False):
storage = self._storage()
if storage[:5] == 's3://':
access_key_id, secret_key, bucket_name, encryption_enabled = s3_helper.parse_bucket_url(storage)
s3 = s3_helper.get_resource(access_key_id, secret_key)
access_key_id, secret_key, bucket_name, do_space_url, \
    encryption_enabled = s3_helper.parse_bucket_url(storage)
s3 = s3_helper.get_resource(
    access_key_id, secret_key, do_space_url)
s3_bucket = self._connect_to_S3_bucket(s3, bucket_name)
file_exists = s3_helper.object_exists(s3, s3_bucket.name, fname)
if not file_exists:
# Some old files (prior to the installation of odoo-s3) may
# still be stored in the file system even though
# ir_attachment.location is configured to use S3
try:
read = super(S3Attachment, self)._file_read(fname, bin_size=False)
read = super(S3Attachment, self)._file_read(
fname, bin_size=False)
except Exception:
# Could not find the file in the file system either.
return False
@@ -56,8 +59,10 @@ def _file_read(self, fname, bin_size=False):
def _file_write(self, value, checksum):
storage = self._storage()
if storage[:5] == 's3://':
access_key_id, secret_key, bucket_name, encryption_enabled = s3_helper.parse_bucket_url(storage)
s3 = s3_helper.get_resource(access_key_id, secret_key)
access_key_id, secret_key, bucket_name, do_space_url, \
    encryption_enabled = s3_helper.parse_bucket_url(storage)
s3 = s3_helper.get_resource(
    access_key_id, secret_key, do_space_url)
s3_bucket = self._connect_to_S3_bucket(s3, bucket_name)
bin_value = base64.b64decode(value)
fname = hashlib.sha1(bin_value).hexdigest()
27 changes: 13 additions & 14 deletions models/s3_helper.py
@@ -1,5 +1,3 @@
# -*- coding: utf-8 -*-
#!/usr/bin/env python3
"""
s3_helper.py
~~~~~~~~~~~~~~~~~
@@ -12,7 +10,7 @@

import boto3
# uncomment for debug mode:
#boto3.set_stream_logger('')
# boto3.set_stream_logger('')
import botocore
from boto3.session import Session
from boto3.s3.transfer import S3Transfer
@@ -23,7 +21,7 @@ def parse_bucket_url(bucket_url):
"Expecting an s3:// scheme, got {} instead.".format(scheme)

# scheme:
# s3://<Your-AWS-Access-Key-ID>:<Your-AWS-Secret-Key>@<Your-S3-Bucket-name>+SSE
# s3://<Your-AWS-Access-Key-ID>:<Your-AWS-Secret-Key>@<Your-S3-Bucket-name>&<Your-DigitalOcean-base-url>+SSE
# where +SSE is optional (meaning server-side encryption enabled)

try:
@@ -32,8 +30,10 @@
access_key_id = remain.split(':')[0]
remain = remain.lstrip(access_key_id).lstrip(':')
secret_key = remain.split('@')[0]
remain = remain.lstrip(secret_key).lstrip('@').split('+')
bucket_name = remain[0]
remain = remain.lstrip(secret_key).lstrip('@')
bucket_name = remain.split('&')[0]
remain = remain.lstrip(bucket_name).lstrip('&').split('+')
do_space_url = remain[0]
encryption_enabled = len(remain) > 1

if not access_key_id or not secret_key:
@@ -44,7 +44,7 @@
except Exception:
raise Exception("Unable to parse the S3 bucket url.")

return (access_key_id, secret_key, bucket_name, encryption_enabled)
return (access_key_id, secret_key, bucket_name, do_space_url, encryption_enabled)


def bucket_exists(s3, bucket_name):
@@ -69,21 +69,20 @@ def object_exists(s3, bucket_name, key):
return exists


def get_resource(access_key_id, secret_key):
session = Session(access_key_id, secret_key)
s3 = session.resource('s3')
def get_resource(access_key_id, secret_key, endpoint_url):
session = boto3.Session(access_key_id, secret_key)
s3 = session.resource('s3', endpoint_url='https://' + endpoint_url)
return s3

# extra: works for files stored in the file system
# (not called by models.py which only deal with in-memory)
def upload(value, storage):
access_key_id, secret_key, bucket_name, encryption_enabled = parse_bucket_url(storage)
access_key_id, secret_key, bucket_name, do_space_url, encryption_enabled = parse_bucket_url(storage)
s3 = get_resource(access_key_id, secret_key, do_space_url)
# S3Transfer supports multi-part uploads, callbacks, etc.
# http://boto3.readthedocs.io/en/latest/_modules/boto3/s3/transfer.html
transfer = S3Transfer(s3.meta.client)
if encryption_enabled:
transfer.upload_file(value, bucket_name, value, extra_args={'ServerSideEncryption': 'AES256'})
else:
transfer.upload_file(value, bucket_name, value)
Binary file added static/description/icon.png