If you use Google Cloud Storage, you can exchange files between magnews and a GCS bucket, making it easier to manage imports and exports.
At the moment, you can use Google Cloud Storage in scheduled tasks to import and export files. In the future, it will also be available in other areas of the platform.
To use Google Cloud Storage in scheduled tasks, you need to configure three elements:
- a digital certificate with the Google Cloud service account key
- an external file server of type Google Cloud Storage
- a scheduled task that uses that file server to read or write files
The integration is designed to make Google Cloud Storage available within the same operational flow you already use for other remote servers, so you can manage imports and exports in a simpler and more consistent way. For example, you can:
- import files from a Google Cloud Storage bucket into magnews
- export files from magnews to a Google Cloud Storage bucket
Create the digital certificate for Google Cloud Storage
To connect Google Cloud Storage to magnews, you need a JSON key from a Google Cloud service account with the appropriate permissions on the bucket you want to use (the required OAuth scope is https://www.googleapis.com/auth/devstorage.read_write).
Where to get the certificate content
The JSON authentication file must be created in the Google Cloud Console by generating a key for a service account (Service Account Key).
To learn more about key management and authentication, refer to the official Google documentation (IAM documentation – Key management).
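For reference, a service account key file downloaded from the Google Cloud Console follows this general structure (the values below are placeholders, not real credentials):

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "0123456789abcdef",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "your-service-account@your-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/..."
}
```

The entire file, from the opening brace to the closing brace, is what you will paste into magnews in the next step.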
How to create the certificate in magnews
Go to API & Integrations > Digital certificates and create a new certificate.
Fill in the fields:
- Name: enter a descriptive name to easily identify the certificate
- Certificate ID: optional
- Type of certificate: select Google Cloud Service key
- Content of certificate: paste the full content of the service account JSON file.
Magnews performs a basic validation on the certificate content to verify that it includes the required data.
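To give an idea of what such a validation involves, here is a minimal sketch in Python. The function name and the exact checks are hypothetical; magnews' internal validation logic is not documented, but a service account key must at least be valid JSON with `"type": "service_account"` and the credential fields shown above.

```python
import json

# Fields a Google Cloud service account key file is expected to contain.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email", "token_uri"}

def looks_like_service_account_key(content: str) -> bool:
    """Return True if the pasted text parses as JSON and contains
    the fields a service account key should have."""
    try:
        data = json.loads(content)
    except json.JSONDecodeError:
        return False
    return (
        isinstance(data, dict)
        and data.get("type") == "service_account"
        and REQUIRED_FIELDS <= data.keys()
    )
```

If the check fails when you save the certificate, the most common cause is pasting a truncated file or only part of the JSON.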
Create the Google Cloud Storage external file server
After creating the certificate, you can configure the connection to the bucket.
Google Cloud Storage is not a traditional FTP server. For this reason, in magnews it is configured as an external file server, but with specific fields different from FTP or SFTP.
Go to FTP > External file servers and create a new file server.
Fill in the available fields:
- Server name: descriptive name
- Key identifier: unique server identifier (optional)
- Server type: select Google Cloud Storage
- Bucket: enter the name of the Google Cloud Storage bucket you want to connect (the bucket must already exist on GCS)
- Certificate: select the GCS digital certificate created earlier
For this type of configuration, you do not need to fill in typical FTP or SFTP fields such as host, port, username, or password, because Google Cloud Storage uses a different authentication system.
Use Google Cloud Storage in scheduled tasks
Once the external file server is configured, you can use it in scheduled tasks.
To import files from Google Cloud Storage
If you need to import contacts or data tables into magnews, in scheduled tasks you can select Google Cloud Storage as the data source by choosing the configured file server. This allows you to read files stored in the bucket and use them as the source for the import.
You can specify a file prefix to filter the files read from Google Cloud Storage. GCS is not a traditional file server: buckets have no real directories, and object names that share a common prefix are used to simulate folders, so the prefix acts as a path filter.
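The effect of a prefix filter can be sketched in a few lines of Python. The helper below is purely illustrative (it is not a magnews or GCS API), but it matches how prefix filtering behaves on a flat bucket:

```python
# A GCS bucket is flat: object names like "imports/contacts_2024.csv" are
# plain strings, and the "imports/" folder is just a shared name prefix.

def filter_by_prefix(object_names, prefix):
    """Return the object names a GCS prefix filter would match."""
    return [name for name in object_names if name.startswith(prefix)]

objects = [
    "imports/contacts_2024.csv",
    "imports/contacts_2025.csv",
    "exports/report.csv",
]

# Selects only the two files under the simulated "imports/" folder.
matched = filter_by_prefix(objects, "imports/")
```

Because matching is on the raw object name, the trailing slash matters: the prefix `imports/` behaves like a folder, while `imports` would also match an object named `imports_old.csv`.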
To export files to Google Cloud Storage
If you need to export data from magnews, in scheduled tasks you can select Google Cloud Storage as the file destination.
In this case, choose:
- the Google Cloud Storage file server
- the optional file prefix, which is the path or initial name to use when saving the file in the bucket
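On export, the prefix is simply prepended to the file name to build the final object name in the bucket. A hypothetical sketch (the function name is illustrative, not a magnews API) of the two ways you can use it, as a path or as an initial name:

```python
def destination_object_name(prefix, filename):
    """Build the GCS object name for an exported file:
    the configured prefix followed by the file name."""
    return prefix + filename

# Path-style prefix: the file lands in a simulated folder.
path_style = destination_object_name("exports/daily/", "contacts.csv")

# Initial-name prefix: the file name itself is prefixed.
name_style = destination_object_name("daily_", "contacts.csv")
```

With a trailing slash the exported file appears inside a simulated folder; without one, the prefix becomes the start of the file name itself.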
What to check if it doesn’t work
If the configuration does not work as expected, in most cases it is useful to check the following:
- Is the JSON file pasted into magnews (certificate content) complete and correct?
- Does the service account have the correct permissions on the Google Cloud Storage bucket?
- Has the bucket name been entered correctly in the external file server?
- Does the scheduled task use the correct file server and correctly specify the prefix to use?