Storage FAQ

Space

What third-party apps are interoperable with Space?

How can I connect to Space?

Space is available over the internet or through select direct connects.

How can I migrate my data from a different cloud? What about on-prem?

Data can be migrated to and from most public cloud, private cloud, or on-prem storage using Transporter.

Transporter is a powerful yet simple data mobility tool that quickly and securely migrates data from any S3-compatible storage, as long as the data source is available via endpoint URL, secret key, and access key. Transporter is accessible as a self-service tool through the portal, or you can contact us for more support with the process. See our Transporter webpage or refer to the Transporter User Guide for step-by-step instructions.

How secure is my data? Why should I trust you with my data?

Your data is protected at multiple levels. Customers choose the amount of redundancy across data centers and the replication locations themselves, and can also choose whether to encrypt their data before it enters Space. Connections are encrypted over HTTPS on the way in and during replication from data center to data center, and we offer data at rest encryption (DARE) with customer-managed keys.

Where do you store my data?

We allow customers to choose where their data is stored.

How do I use the Space interface?

Space uses a simple S3-compatible interface over HTTPS that can be used in two ways:

  • Interactively, through our native web GUI or a third-party S3-compatible GUI or CLI client.
  • Programmatically, through API calls with S3-compatible SDKs or libraries, using your Endpoint, Access Key, and Secret Key credentials.

For a more elaborate how-to guide or for specific step-by-step instructions, see the Space User Guide.
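
For example, here is a minimal sketch of programmatic access using the AWS CLI against an S3-compatible endpoint. The endpoint URL, profile name, and bucket name below are illustrative; substitute your own Space endpoint and credentials.

# Store your Space Access Key and Secret Key in a named profile
aws configure --profile space

# List your buckets against the Space endpoint
aws s3 ls --profile space --endpoint-url https://s3.rstorcloud.io

# Upload and download an object
aws s3 cp ./report.pdf s3://mybucket/report.pdf --profile space --endpoint-url https://s3.rstorcloud.io
aws s3 cp s3://mybucket/report.pdf ./report-copy.pdf --profile space --endpoint-url https://s3.rstorcloud.io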

Can I leave my data in Space but run applications on a different cloud, like Azure?

Yes. Space is accessible over the internet from any cloud service provider.

Does Space allow me to store a primary or secondary copy of my data?

Yes, you can store either. We also allow customers to enable multi-region replication.

What are the size requirements and limits for storing objects in Space?

There is no minimum size necessary for storing objects in Space, but the maximum object size is 5 TB.

Do you support immutable use cases?

Yes, we support versioning, object locking, and bucket logging.
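
As a rough sketch, versioning and a default Object Lock retention rule can be enabled through the standard S3 API, for example with the AWS CLI. The bucket name and endpoint URL below are illustrative, and the exact Object Lock workflow supported by Space may differ; see the Space User Guide for the supported procedure.

# Enable versioning on the bucket
aws s3api put-bucket-versioning --bucket mybucket --versioning-configuration Status=Enabled --endpoint-url https://s3.rstorcloud.io

# Apply a default 30-day COMPLIANCE retention rule to newly written objects
aws s3api put-object-lock-configuration --bucket mybucket --object-lock-configuration '{"ObjectLockEnabled": "Enabled", "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}}}' --endpoint-url https://s3.rstorcloud.io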

What does a Public Access Mode bucket mean/do?

Public access buckets have open read access for anyone who can reach the bucket. A user would need to know the URL of an object to access it, and the only other restriction is the IP whitelist on the bucket. No special tools are needed to access a file in a public bucket; a simple curl or browser request can read any file in it. For example, if your bucket is called anonymous and it contains a file called x.txt, the following command would access it:

$ curl https://anonymous.s3.rstorcloud.io/x.txt

Or

$ curl https://s3.rstorcloud.io/anonymous/x.txt

Note that IP whitelisting still applies to this access. Please also be aware that S3 is case sensitive, so the case of bucket and file names matters.

Also note that this only provides read access; it does not provide list, write, or delete access to objects. To perform those actions through the S3 REST interface, you still need key validation (and the usual S3 permissions) for access to the bucket.
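
For instance, listing or writing to that same bucket still requires signed requests with valid keys; the endpoint and profile below are illustrative.

# Listing the bucket requires valid Access Key and Secret Key credentials (and the usual S3 permissions)
aws s3 ls s3://anonymous --profile space --endpoint-url https://s3.rstorcloud.io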

How can I create a versioned bucket in Space?

How can I use bucket logging through the Space GUI?

Is data access auditing available through Space?

Yes, data access auditing is available. Space provides bucket logging features, and you can enable bucket logging as a means of auditing access.
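
As a minimal sketch, bucket logging can also be enabled through the S3 API with the AWS CLI; the bucket names and endpoint below are illustrative, and the GUI workflow is covered in the Space User Guide.

# Send access logs for mybucket to my-log-bucket under the access-logs/ prefix
aws s3api put-bucket-logging --bucket mybucket --bucket-logging-status '{"LoggingEnabled": {"TargetBucket": "my-log-bucket", "TargetPrefix": "access-logs/"}}' --endpoint-url https://s3.rstorcloud.io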

What types of data can be stored in Space?

Any type of data can be stored in Space, and any application that can read or write data via the S3 protocol can use it.

What type of data consistency model do you utilize?

Space employs an eventual consistency model.

What are the IP address ranges?

We publish a list of IP address ranges in JSON format. To see the current list, download the .json file. If you want to keep track of the version history, save each version of the .json file to your system so you can compare against previous versions and identify the changes.
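
For example, here is a simple sketch for keeping dated copies and spotting changes; the download URL below is only a placeholder, so substitute the actual link to our published .json file.

# Download today's copy of the published IP ranges (placeholder URL)
curl -o ip-ranges-$(date +%F).json https://example.com/space-ip-ranges.json

# Compare against a previously saved copy to identify changes
diff ip-ranges-2024-01-01.json ip-ranges-$(date +%F).json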

How do I protect myself from accidental data deletion?

We suggest the following strategies to help prevent accidental data deletion: enable versioning and object locking, apply access policies that do not allow deletion (an example is sketched below), and limit the administrative privileges granted to external users.
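
As an illustration of an access policy that does not allow deletion, the following sketch of a bucket policy denies object deletion for everyone; the bucket name is a placeholder, and you would apply it as described under “How can I set a custom bucket policy?” below.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyObjectDeletion",
      "Effect": "Deny",
      "Principal": { "AWS": ["*"] },
      "Action": ["s3:DeleteObject", "s3:DeleteObjectVersion"],
      "Resource": ["arn:aws:s3:::bucketname/*"]
    }
  ]
}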

How durable and available is my data in Space?

Space offers eleven nines (99.999999999%) of durability and 99.99% availability, with multi-region replication.

What steps do you take to avoid malicious encryption?

Data is encrypted in transit over HTTPS, both on ingest and during replication, and we offer data at rest encryption with customer-managed keys.

As the user, you are also able to take steps to limit access to your data using policies that specify the users who can see specific buckets and objects. For more information on how to create policies, see the Space User Guide. You can also enable both object locking and versioning for an additional level of security.

What levels of encryption do you offer?

Data is encrypted in transit over HTTPS, both on ingest and during replication, and we offer data at rest encryption (DARE) with customer-managed keys. We also offer AES 256-bit encryption and SHA-256 checksums for data integrity.
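
For instance, here is a sketch of uploading an object with a customer-provided AES-256 key (SSE-C) using the AWS CLI; the key file, bucket name, and endpoint are illustrative, and the same key must be supplied to read the object back.

# Generate a random 256-bit key and keep it safe; it is required to read the object back
openssl rand -out sse-key.bin 32

# Upload with server-side encryption using the customer-provided key
aws s3 cp ./data.csv s3://mybucket/data.csv --sse-c AES256 --sse-c-key fileb://sse-key.bin --endpoint-url https://s3.rstorcloud.io

# Download using the same key
aws s3 cp s3://mybucket/data.csv ./data-copy.csv --sse-c AES256 --sse-c-key fileb://sse-key.bin --endpoint-url https://s3.rstorcloud.io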

Transporter

Is egress from my current cloud service provider included in your price?

No; customers are responsible for their own egress fees.

How are jobs prioritized? Are any preferences given?

Jobs are queued for execution on a first-come, first-served basis. We can handle many parallel transfers simultaneously for each cloud provider, so you will not need to wait for one job to end before another starts.

Do I need to provide a direct connect to use Transporter?

No, we can leverage our own connections to CSPs. Transporter is delivered as SaaS, meaning you don’t need to provide any kind of special connectivity or network equipment to use it.

Do I need to stand up a server or other VM to use Transporter?

No, Transporter is a service. Sign up or log in to your account, select the source and enter the credentials for the buckets you want to migrate or replicate, then choose the destination, and your job will be queued for execution.

What kind of performance can I expect?

Performance depends on the type of connectivity into the CSP and on whether the source or destination applies any rate limiting.

How secure is my data? Can you access content in my cloud?

Data in transit is always encrypted. We never have access to your credentials, such as your access key and secret key.

How is object access supported while a migration or replication job is running?

Any object that is written to the destination is instantly visible in the destination bucket and can be accessed the same as any other data.

Does Transporter delete data in my source cloud after migration?

No, not by default. This is a feature that can be enabled in the future.

Why does the platform only give an estimate of my data savings and transfer time?

Until we do a full bucket scan, we don’t know the complexity of all the objects to be moved. Billions of small objects have different transfer characteristics than a few large objects.

Can you guarantee that all my files will be transported successfully?

Yes, we verify that every object sent is successfully received, and provide a log to support this as well.

Can I transfer across the public Internet?

Yes. Transporter does not require any special networking. Performance will be impacted by the network connection bandwidth. The CSPs may also offer discounted egress rates when transfers are initiated using direct connects.

Do you support on-prem to cloud migration?

Yes, if the on-prem source is an S3-compatible object storage, then it is supported.

What sources and destinations are supported?

Transporter can migrate data to and from any S3-compatible storage.

Can you transfer data of any size?

Yes, we can transfer data of any size.

Can you migrate any type of object?

Yes, we can move any unstructured data.

Developer FAQ

How can I use legal holds and retention locks on objects in Space?

How can I use bucket logging using a CLI in Space?

How can I upload files to Space with content disposition headers?

How can I make a simple key file in order to take advantage of data at rest encryption (DARE) using a customer provided key in Space?

How can I use Cyberduck for bucket versioning?

How can I partially upload a file that is already stored in Space without uploading a new copy?

How can I upload files with metadata attached?

How can I run a benchmark on services such as Space and Amazon S3?

What is Rocket-FS and what are its capabilities?

How can I set a custom bucket policy?

You may want to create a public bucket that is read only. There are two ways to do this:

Use the WebGUI to add a custom bucket policy by selecting the wrench tool (“Bucket Policies”) next to the bucket name, then copy and paste the custom JSON policy below, substituting your bucket name.

Or:

Use the AWS CLI tool. Save the policy below to a JSON file and apply it:
aws s3api put-bucket-policy --bucket MyBucket --policy file://policy.json --endpoint-url MyURL

You can view the current bucket policy with the AWS CLI tool:
aws s3api get-bucket-policy --bucket MyBucket --endpoint-url MyURL

Example of a custom JSON policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": ["s3:GetBucketLocation"],
      "Effect": "Allow",
      "Principal": { "AWS": ["*"] },
      "Resource": ["arn:aws:s3:::bucketname"],
      "Sid": ""
    },
    {
      "Action": ["s3:GetObject"],
      "Effect": "Allow",
      "Principal": { "AWS": ["*"] },
      "Resource": ["arn:aws:s3:::bucketname/*"],
      "Sid": ""
    }
  ]
}

For more information on the use and format of Amazon Resource Names (ARNs), see https://docs.aws.amazon.com/general/latest/gr/aws-arns-and-namespaces.html.