How to Upload Large Files with Dropbox API for Python
Just a .py file with a few of my modifications to an old StackOverflow post
The code in this post is heavily based on StackOverflow posts by Greg from Dropbox.
When you’re working on a cloud workstation (e.g. GCP or AWS), you often have to handle large data sets. Although you can use the cloud storage provided by GCP or AWS, I prefer Dropbox because I use it not only for the cloud workstation but also for other purposes. While it was relatively easy to download files from Dropbox using the wget command and a Dropbox URL, I found it difficult to upload files to Dropbox without the browser or desktop app. This is a TIL (Today I Learned) post, and I hope it helps those who run into similar problems.
Getting Started with Dropbox API
To use the Dropbox API for Python, you must get an API access token by creating an app in the developer section of Dropbox. You can find many posts on the web explaining how to do it.
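For reference, here is a minimal sketch of how the token is used once you have it; <ACCESS_TOKEN> is a placeholder for the token generated in your app console:

```python
import dropbox

# Create a client with the access token generated in the Dropbox App Console.
dbx = dropbox.Dropbox("<ACCESS_TOKEN>")

# Sanity check: this raises an AuthError if the token is invalid.
print(dbx.users_get_current_account().name.display_name)
```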
Upload_Session is What You’re Looking For
The files_upload method only supports single files up to 150 MB in size, so you have to use an upload session for larger files.
Greg from Dropbox posted code that uses upload sessions to send larger files in chunks.
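In outline, that approach looks like the sketch below. This is my paraphrase of the pattern from those posts rather than Greg’s exact code, and upload_large_file is a name I made up for illustration:

```python
import os
import dropbox

def upload_large_file(dbx, local_path, dropbox_path, chunk_size=8 * 1024 * 1024):
    """Upload local_path to dropbox_path, using an upload session for large files."""
    file_size = os.path.getsize(local_path)
    with open(local_path, "rb") as f:
        if file_size <= chunk_size:
            # Small files fit in a single files_upload call (up to 150 MB).
            dbx.files_upload(f.read(), dropbox_path)
            return
        # Start a session with the first chunk, then append the rest.
        session = dbx.files_upload_session_start(f.read(chunk_size))
        cursor = dropbox.files.UploadSessionCursor(
            session_id=session.session_id, offset=f.tell())
        commit = dropbox.files.CommitInfo(path=dropbox_path)
        while f.tell() < file_size:
            if (file_size - f.tell()) <= chunk_size:
                # Last chunk: finish the session and commit the file.
                dbx.files_upload_session_finish(f.read(chunk_size), cursor, commit)
            else:
                dbx.files_upload_session_append_v2(f.read(chunk_size), cursor)
                cursor.offset = f.tell()
```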
If you want to implement your own version of the upload function, please check the original StackOverflow posts.
Just Use This .py File
I have added a few lines to Greg’s code for more convenient use of the upload function.
You can upload your local file at <LOCAL_FILE> to <DROPBOX_PATH> in your Dropbox storage with this command:

python dbu.py <DROPBOX_PATH> <LOCAL_FILE> --timeout 900 --chunk 8

where timeout is a timeout in seconds and chunk is the upload size per session in MB. I have checked that 900 s is usually enough for uploading an 8 MB chunk.
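For a rough idea of how those options fit together, here is a sketch of the argument parsing; the option names match the command above, but treat the details as illustrative rather than the exact contents of dbu.py:

```python
import argparse
import dropbox

def parse_args():
    parser = argparse.ArgumentParser(description="Upload a large file to Dropbox.")
    parser.add_argument("dropbox_path", help="destination path in Dropbox")
    parser.add_argument("local_file", help="local file to upload")
    parser.add_argument("--timeout", type=int, default=900,
                        help="timeout in seconds for each request")
    parser.add_argument("--chunk", type=int, default=8,
                        help="upload size per session in MB")
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    # The Dropbox client accepts a per-request timeout in seconds.
    dbx = dropbox.Dropbox("<ACCESS_TOKEN>", timeout=args.timeout)
    # Hand off to the chunked-upload function sketched earlier:
    # upload_large_file(dbx, args.local_file, args.dropbox_path,
    #                   chunk_size=args.chunk * 1024 * 1024)
```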
Also, I have added verbose output that shows the progress of the upload along with the elapsed time.
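One simple way to produce that kind of output is to print the file handle’s position after each chunk, e.g. with a hypothetical helper like this, called with f.tell() after every files_upload_session_append_v2:

```python
import sys
import time

def report_progress(uploaded_bytes, total_bytes, start_time):
    # Hypothetical helper: overwrite one status line with percent done and elapsed time.
    elapsed = time.time() - start_time
    pct = 100.0 * uploaded_bytes / total_bytes
    sys.stdout.write(f"\r{pct:5.1f}% uploaded, {elapsed:6.0f} s elapsed")
    sys.stdout.flush()
```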
Please feel free to use my Python script!