Storage FAQ
Space
What third-party apps are interoperable with Space?
How can I connect to Space?
How can I migrate my data from a different cloud? What about on-prem?
Data can be migrated to and from most public cloud, private cloud, or on-prem storage using Transporter.
Transporter is a powerful yet simple data mobility tool that quickly and securely migrates data from any S3-compatible storage, as long as the data source is available via endpoint URL, secret key, and access key. Transporter is accessible as a self-service tool through the portal, or you can contact us for more support with the process. See our Transporter webpage or refer to the Transporter User Guide for step-by-step instructions.
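Before starting a migration, it can help to confirm that the source endpoint and keys work with any S3-compatible client; these are the same three values Transporter asks for. A minimal sketch with the AWS CLI, using hypothetical endpoint, bucket, and credential values:
$ aws configure set aws_access_key_id SOURCE_ACCESS_KEY
$ aws configure set aws_secret_access_key SOURCE_SECRET_KEY
$ aws s3 ls s3://source-bucket --endpoint-url https://source-endpoint.example.com
If the listing succeeds, the same endpoint URL, access key, and secret key can be entered in the Transporter portal.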
How secure is my data? Why should I trust you with my data?
Where do you store my data?
How do I use the Space interface?
Space uses a simple S3-compatible interface over HTTPS that can be used in two ways.
- Interactively, through our native web GUI or a third-party S3-compatible GUI or CLI client.
- Programmatically, through API calls with S3-compatible SDKs or libraries, using your Endpoint, Access Key, and Secret Key credentials (see the example below).
For a more elaborate how-to guide or for specific step-by-step instructions, see the Space User Guide.
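As an example of the programmatic path, here is a minimal sketch using the AWS CLI pointed at the Space endpoint; the endpoint matches the rstorcloud.io host used elsewhere in this FAQ, while the bucket and file names are hypothetical:
$ aws configure set aws_access_key_id YOUR_ACCESS_KEY
$ aws configure set aws_secret_access_key YOUR_SECRET_KEY
$ aws s3 ls --endpoint-url https://s3.rstorcloud.io
$ aws s3 cp ./report.pdf s3://mybucket/report.pdf --endpoint-url https://s3.rstorcloud.io
Any S3-compatible SDK works the same way: point it at the Space endpoint and supply your access key and secret key.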
Can I leave my data in Space but run applications on a different cloud, like Azure?
Does Space allow me to store a primary or secondary copy of my data?
What are the size requirements and limits for storing objects in Space?
Do you support immutable use cases?
What does a Public Access Mode bucket mean/do?
Public access buckets have open read access for anyone who can reach the bucket. A user needs to know the URL of an object to access it, and the only other restriction is the IP whitelist on the bucket. No special tools are needed to access a file in a public bucket; a simple curl or browser request can retrieve any file in it. For example, if your bucket is called anonymous and it contains a file called x.txt, either of the following commands would retrieve it:
$ curl https://anonymous.s3.rstorcloud.io/x.txt
Or
$ curl https://s3.rstorcloud.io/anonymous/x.txt
Note that IP whitelisting still applies to this access. Also be aware that S3 is case sensitive, so the case of bucket and file names matters.
Note that this only provides read access; it does not provide list, write, or delete access to objects. To perform those actions through the S3 REST interface, you still need key validation (and the usual S3 permissions) for the bucket.
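To illustrate, an unauthenticated write to the same public bucket is rejected, while an authenticated client with valid keys and permissions can write as usual (hypothetical file name):
$ curl -X PUT --data-binary @y.txt https://anonymous.s3.rstorcloud.io/y.txt
The PUT above is denied because no credentials are supplied, whereas a credentialed request succeeds:
$ aws s3 cp y.txt s3://anonymous/y.txt --endpoint-url https://s3.rstorcloud.io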
How can I create a versioned bucket in Space?
How can I use bucket logging through the Space GUI?
Is data access auditing available through Space?
What types of data can be stored in Space?
What type of data consistency model do you utilize?
What are the IP address ranges?
How do I protect myself from accidental data deletion?
How durable and available is my data in Space?
What steps do you take to avoid malicious encryption?
We encrypt your data in transit over HTTPS, including replication traffic, and use data-at-rest encryption with customer-managed keys.
As the user, you can also limit access to your data with policies that specify which users can see specific buckets and objects; for more information on creating policies, see the Space User Guide. You can also enable both object locking and versioning for an additional level of security.
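As a sketch, versioning and a default object-lock retention rule can be enabled with the AWS CLI; the bucket name, 30-day retention period, and endpoint below are assumptions, and object lock typically has to be enabled when the bucket is created:
$ aws s3api put-bucket-versioning --bucket mybucket --versioning-configuration Status=Enabled --endpoint-url https://s3.rstorcloud.io
$ aws s3api put-object-lock-configuration --bucket mybucket --object-lock-configuration '{"ObjectLockEnabled": "Enabled", "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}}}' --endpoint-url https://s3.rstorcloud.io
With versioning on, an overwritten or maliciously encrypted object leaves its previous version recoverable; with a compliance-mode lock, locked versions cannot be deleted until the retention period expires.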
What levels of encryption do you offer?
Transporter
Is egress from my current cloud service provider included in your price?
How are jobs prioritized? Are any preferences given?
Do I need to provide a direct connect to use Transporter?
Do I need to stand up a server or other VM to use Transporter?
What kind of performance can I expect?
How secure is my data? Can you access content in my cloud?
How is object access supported while a migration or replication job is running?
Does Transporter delete data in my source cloud after migration?
Why does the platform only give an estimate of my data saving and transfer time?
Can you guarantee that all my files will be transported successfully?
Can I transfer across the public Internet?
Do you support on-prem to cloud migration?
What sources and destinations are supported?
Can you transfer data of any size?
Can you migrate any type of object?
Developer FAQ
How can I use legal holds and retention locks on objects in Space?
How can I use bucket logging using a CLI in Space?
How can I upload files to Space with content disposition headers?
How can I make a simple key file in order to take advantage of data at rest encryption (DARE) using a customer provided key in Space?
How can I use Cyberduck for bucket versioning?
How can I partially upload a file that is already stored in Space without uploading a new copy?
How can I upload files with metadata attached?
How can I run a benchmark on services such as Space and Amazon S3?
What is Rocket-FS and what are its capabilities?
How can I set a custom bucket policy?
You may want to create a public bucket that is read only. There are two ways to do this:
Use the WebGUI to add a custom bucket policy by selecting the tool wrench “Bucket Policies” next to the bucket name. Copy and paste the custom JSON policy below, substituting your bucket name.
Or:
Use the AWS CLI tool. Save the policy below to a JSON file and apply it:
aws s3api put-bucket-policy --bucket MyBucket --policy file://policy.json --endpoint-url MyURL
To retrieve the current bucket policy with the AWS CLI tool:
aws s3api get-bucket-policy --bucket MyBucket --endpoint-url MyURL
Example of a custom JSON policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": ["s3:GetBucketLocation"],
      "Effect": "Allow",
      "Principal": { "AWS": ["*"] },
      "Resource": ["arn:aws:s3:::bucketname"],
      "Sid": ""
    },
    {
      "Action": ["s3:GetObject"],
      "Effect": "Allow",
      "Principal": { "AWS": ["*"] },
      "Resource": ["arn:aws:s3:::bucketname/*"],
      "Sid": ""
    }
  ]
}
For more information on the use and format of Amazon Resource Names (ARNs), see https://docs.aws.amazon.com/general/latest/gr/aws-arns-and-namespaces.html.
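Once the policy is applied, anonymous reads should work while list, write, and delete still require credentials; a quick check with curl, using hypothetical bucket and object names:
$ curl https://bucketname.s3.rstorcloud.io/somefile.txt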