Activity Logs

Cloud Storage users can obtain logs with information on requests sent to public containers: which files were downloaded and how many times, as well as the total amount of data downloaded from storage over a given period of time.

Logs can be obtained in two ways: from the control panel and via the API. Detailed instructions for downloading and analyzing logs are given below.

Retrieving Logs from the Control Panel

In the control panel, open the Operation history tab and click Generate report. It may take several minutes to generate log files, especially for long reporting periods. Once the operation is complete, a download link will appear in the list, and the file will also be saved in the automatically created Logs container.

Retrieving Logs from the API

You can also obtain logs by submitting a request to the API.

Example:

curl -i -XPOST https://api.selcdn.ru/v1/logs -H "X-Auth-Token: rg17f50x400a38q284dcae97186lw900" -H "X-Start-Time: 2017-05-01 00:00:00" -H "X-End-Time: 2017-06-01 00:00:00"

As you can see from the example above, this is a POST request to https://api.selcdn.ru/v1/logs. The request contains the following headers:

  • X-Auth-Token - authentication token (for more information, see the API documentation)
  • X-Start-Time - start of the reporting period (date and time)
  • X-End-Time - end of the reporting period (date and time)

A successful API request will return a 200 OK code.

Retrieving Logs: Python Program Example

import requests

# Enter login and password for authentication
headers = {'X-Auth-User': 'uSeR', 'X-Auth-Key': 'Pa$$w0rd'}

# Send the authentication request
r = requests.get('https://auth.selcdn.ru/', headers=headers)
assert r.status_code == 204

# Take the X-Auth-Token and X-Storage-Url headers from the response and save their values in variables
auth_token = r.headers.get('X-Auth-Token')
storage_url = r.headers.get('X-Storage-Url')

# Add the headers needed for retrieving logs
log_headers = {'X-Auth-Token': auth_token, 'X-Start-Time': '2017-05-29 00:00:00', 'X-End-Time': '2017-06-21 00:00:00'}

# Request logs
r = requests.post(storage_url + 'logs', headers=log_headers)

If the request is successfully processed, a report will be added to the Logs container.
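
The generated report can also be downloaded over the API, since it is stored as a regular object. Below is a minimal sketch of how this might look with requests; the container name logs is an assumption here, so check the name of the automatically created container in the control panel.

import requests

# Authenticate the same way as in the example above
auth = requests.get('https://auth.selcdn.ru/',
                    headers={'X-Auth-User': 'uSeR', 'X-Auth-Key': 'Pa$$w0rd'})
auth_token = auth.headers.get('X-Auth-Token')
storage_url = auth.headers.get('X-Storage-Url')

# List the objects in the container that stores the reports
# (the container name 'logs' is an assumption)
r = requests.get(storage_url + 'logs', headers={'X-Auth-Token': auth_token})
report_names = r.text.splitlines()

# Download the last report in the listing and save it locally
if report_names:
    name = report_names[-1]
    obj = requests.get(storage_url + 'logs/' + name,
                       headers={'X-Auth-Token': auth_token})
    with open(name.replace('/', '_'), 'wb') as f:
        f.write(obj.content)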

Analyzing Logs

Let's look at an example of a log entry:

2016-04-20_11:24:11     path:0000.selcdn.com/images/myimage.png        method:GET      status:200      client:12.34.567.89, 1  agent:Mozilla/5.0 (Ma   tx:340937      rx:595

The record contains the following information:

  • date and time storage was accessed
  • path in the format "storage URL/container name/file name"
  • HTTP method used to submit the request
  • response code
  • software used to submit the command to storage and its features
  • amount of data sent (tx) and received (rx) in bytes (a sketch below shows how to total these values)
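
Since each record contains the number of bytes sent, the log file can also be used to calculate the total amount of data downloaded from storage over the reporting period. Below is a minimal sketch that sums the tx values of successful GET requests; it assumes the tab-separated record format shown in the example above.

def total_downloaded_bytes(path_to_file):
    total = 0
    with open(path_to_file, "r") as logfile:
        for line in logfile:
            fields = [field.strip() for field in line.split('\t')]
            if len(fields) != 8:
                continue  # skip lines that do not match the expected format
            date, path, method, status, client, agent, tx, rx = fields
            if method.endswith('GET') and status == 'status:200':
                total += int(tx[len('tx:'):])
    return total

print(total_downloaded_bytes('/path/to/logfile.log'))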

Sample Program for Analyzing Logs

Let's say we have retrieved a log file and need to count how many times objects were downloaded from our containers. This process can easily be automated.

Below is an example of a simple program, written in Python, which can perform the following actions:

  • identify the lines in a log file referring to successful GET requests to objects or object lists
  • count the number of these requests to objects and object lists
  • return a standard printout with the results

import argparse
from collections import Counter
 
def main():
    parser = argparse.ArgumentParser(
        description='Process storage access log files.')
    parser.add_argument('path_to_file', help='Path to logfile')
    args = parser.parse_args()
 
    stats = Counter()
 
    with open(args.path_to_file, "r") as logfile:
        for line in logfile:
            date, path, method, status, ip, agent, tx, rx = line.split('\t')
            if method.endswith('GET') and status.endswith('200'):
                # Remove the "path:" prefix from the field value
                stats[path[len('path:'):]] += 1
    for objpath, downloads in stats.most_common():
        print('{}: {}'.format(objpath, downloads))
 
if __name__ == "__main__":
    main()

If you would like to use this program, copy the program text to the file parse_logs.py and run the following command:

python parse_logs.py /path/to/logfile.log
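
The program prints one line per object: the object path followed by the number of successful downloads, sorted from the most to the least frequently downloaded object.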