Class: S3Rotate::S3Client
- Inherits: Object
  - Object
  - S3Rotate::S3Client
- Includes: Logging
- Defined in: lib/s3_rotate/aws/s3_client.rb
Instance Attribute Summary collapse
-
#access_key ⇒ Object
Returns the value of attribute access_key.
-
#access_secret ⇒ Object
Returns the value of attribute access_secret.
-
#bucket_name ⇒ Object
Returns the value of attribute bucket_name.
-
#connection ⇒ Object
Get the S3 connection.
-
#region ⇒ Object
Returns the value of attribute region.
Instance Method Summary collapse
-
#bucket ⇒ Object
Get the S3 bucket.
-
#copy(backup_name, file, type) ⇒ Object
Copy an existing file on AWS S3.
-
#exists?(backup_name, backup_date, type, extension = nil) ⇒ Boolean
Check if a remote backup exists.
-
#initialize(key, secret, bucket, region) ⇒ Object
constructor
Initialize a new S3Client instance.
-
#remote_backups(backup_name, type) ⇒ Object
Get the list of remote backups for a specific `backup_name` and `type`.
-
#upload(backup_name, backup_date, type, extension, data) ⇒ Object
Upload raw data to AWS S3.
Methods included from Logging
Constructor Details
#initialize(key, secret, bucket, region) ⇒ Object
Initialize a new S3Client instance.
# File 'lib/s3_rotate/aws/s3_client.rb', line 30

def initialize(key, secret, bucket, region)
  @access_key    = key
  @access_secret = secret
  @bucket_name   = bucket
  @region        = region
end
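The constructor only stores its arguments; no AWS connection is opened until `#connection` or `#bucket` is first used. A self-contained sketch of the same attribute-storing behavior (`S3ClientSketch` is a hypothetical stand-in, since the real class also mixes in `Logging`; the credentials below are placeholders):

```ruby
# Stand-in mirroring S3Client#initialize: it only stores credentials;
# nothing talks to AWS at construction time.
class S3ClientSketch
  attr_accessor :access_key, :access_secret, :bucket_name, :region

  def initialize(key, secret, bucket, region)
    @access_key    = key
    @access_secret = secret
    @bucket_name   = bucket
    @region        = region
  end
end

client = S3ClientSketch.new("AKIA...", "secret", "my-backups", "us-east-1")
puts client.bucket_name  # my-backups
```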
Instance Attribute Details
#access_key ⇒ Object
Returns the value of attribute access_key.

# File 'lib/s3_rotate/aws/s3_client.rb', line 14

def access_key
  @access_key
end
#access_secret ⇒ Object
Returns the value of attribute access_secret.
# File 'lib/s3_rotate/aws/s3_client.rb', line 15

def access_secret
  @access_secret
end
#bucket_name ⇒ Object
Returns the value of attribute bucket_name.
# File 'lib/s3_rotate/aws/s3_client.rb', line 16

def bucket_name
  @bucket_name
end
#connection ⇒ Object
Get the S3 connection.
# File 'lib/s3_rotate/aws/s3_client.rb', line 51

def connection
  @connection
end
#region ⇒ Object
Returns the value of attribute region.
# File 'lib/s3_rotate/aws/s3_client.rb', line 17

def region
  @region
end
Instance Method Details
#bucket ⇒ Object
Get the S3 bucket.
# File 'lib/s3_rotate/aws/s3_client.rb', line 42

def bucket
  @bucket ||= connection.directories.get(bucket_name)
end
#copy(backup_name, file, type) ⇒ Object
Copy an existing file on AWS S3.

# File 'lib/s3_rotate/aws/s3_client.rb', line 108

def copy(backup_name, file, type)
  logger.info("copying #{file.key} to #{backup_name}/#{type}/#{file.key.split('/').last}")

  # can't copy files >5GB
  # need to download them first, then re-upload them using multipart upload
  # also download them to disk to prevent exceeding memory limits
  open("s3_rotate.download.tmp", 'w+') do |f|
    bucket.files.get(file.key) do |chunk, remaining_bytes, total_bytes|
      f.write chunk
    end
  end

  # 104857600 bytes => 100 megabytes
  remote_file = bucket.files.create(key:                  "#{backup_name}/#{type}/#{file.key.split('/').last}",
                                    body:                 File.open("s3_rotate.download.tmp"),
                                    multipart_chunk_size: 104857600)

  # cleanup
  File.delete("s3_rotate.download.tmp")

  # return remote file
  remote_file
end
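Because S3's server-side copy rejects objects over 5 GB, `#copy` streams the object to a temporary file chunk by chunk, then re-uploads it with multipart upload. A self-contained sketch of that stream-to-disk pattern, using an array of strings as stand-ins for the chunks fog yields:

```ruby
require 'tempfile'

chunks = ["part1-", "part2-", "part3"]  # stand-ins for S3 response chunks

# Write each chunk to disk as it arrives (bounding memory use), then
# re-read the whole file, as #copy does before its multipart re-upload.
data = Tempfile.create("s3_rotate.download") do |f|
  chunks.each { |chunk| f.write(chunk) }
  f.rewind
  f.read
end

puts data  # part1-part2-part3
```

`Tempfile.create` with a block deletes the file automatically, which is why no explicit cleanup is needed here, unlike the `File.delete` in `#copy`.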
#exists?(backup_name, backup_date, type, extension = nil) ⇒ Boolean
Check if a remote backup exists.

# File 'lib/s3_rotate/aws/s3_client.rb', line 77

def exists?(backup_name, backup_date, type, extension = nil)
  connection.directories.get(bucket_name, prefix: "#{backup_name}/#{type}/#{backup_date.to_s}#{extension}").files.any?
end
#remote_backups(backup_name, type) ⇒ Object
Get the list of remote backups for a specific `backup_name` and `type`.

# File 'lib/s3_rotate/aws/s3_client.rb', line 63

def remote_backups(backup_name, type)
  connection.directories.get(bucket_name, prefix: "#{backup_name}/#{type}")
end
#upload(backup_name, backup_date, type, extension, data) ⇒ Object
Upload raw data to AWS S3.

# File 'lib/s3_rotate/aws/s3_client.rb', line 92

def upload(backup_name, backup_date, type, extension, data)
  logger.info("uploading #{backup_name}/#{type}/#{backup_date.to_s}#{extension}")

  # 104857600 bytes => 100 megabytes
  bucket.files.create(key:                  "#{backup_name}/#{type}/#{backup_date.to_s}#{extension}",
                      body:                 data,
                      multipart_chunk_size: 104857600)
end
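`#upload`, `#exists?`, and `#copy` all rely on the same object-key layout, `<backup_name>/<type>/<date><extension>`. A sketch of that layout (`backup_key` is a hypothetical helper, not part of the gem's API; the date interpolation matches `#upload`, where `Date#to_s` yields an ISO-8601 string):

```ruby
require 'date'

# Hypothetical helper mirroring the key interpolation in #upload:
# "#{backup_name}/#{type}/#{backup_date}#{extension}"
def backup_key(backup_name, backup_date, type, extension)
  "#{backup_name}/#{type}/#{backup_date}#{extension}"
end

key = backup_key("mydb", Date.new(2020, 1, 15), "daily", ".tar.gz")
puts key  # mydb/daily/2020-01-15.tar.gz
```

Because keys sort lexically by date under each `backup_name`/`type` prefix, a prefix such as `"mydb/daily/2020-01-15"` is enough for `#exists?` to find a matching backup.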