Activity Logs

Cloud Storage users can obtain logs with information on requests sent to public containers: which files were downloaded, how many times, and the total amount of data downloaded from storage over a given period of time.

Logs can be obtained in two ways: from the control panel and from the API. Detailed instructions for downloading and analyzing logs are given below.

Retrieving Logs from the Control Panel

In the control panel, open the Operation history tab.

Click Generate report.

It may take several minutes to generate log files, especially for extended reporting periods.

Once the operation is complete, a link will appear in the list.

The file will also be saved in the automatically created Logs container.

Retrieving Logs from the API

You can also obtain logs by submitting a request to the API.

Example:

curl -i -XPOST https://api.selcdn.ru/v1/logs -H "X-Auth-Token: rg17f50x400a38q284dcae97186lw900" -H "X-Start-Time: 2017-05-01 00:00:00" -H "X-End-Time: 2017-06-01 00:00:00"

As you can see from the example above, this is a POST request to https://api.selcdn.ru/v1/logs. The request contains the following parameters:

  • X-Auth-Token - authentication key (for more information, see the API documentation)
  • X-Start-Time - start of the reporting period (date and time)
  • X-End-Time - end of the reporting period (date and time)

A successful API request will return a 200 OK code.

Analyzing Logs

Let's look at an example of a log entry:

2016-04-20_11:24:11  path:0000.selcdn.com/images/myimage.png  method:GET  status:200  client:12.34.567.89, 1  agent:Mozilla/5.0 (Ma  tx:340937  rx:595

The record contains the following information:

  • date and time the storage was accessed
  • path in the format "storage URL/container name/file name"
  • HTTP method used to submit the request
  • response code
  • software (user agent) used to submit the request to storage
  • amount of data sent (tx) and received (rx) in bytes
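The sample program later in this article splits log records on tab characters; on that assumption, a single record can be unpacked into named fields. This is a sketch only, and the record string below is the sample entry above with the truncated fields simplified:

```python
# A sample log record (tab-separated fields, as in the example above)
record = ("2016-04-20_11:24:11\tpath:0000.selcdn.com/images/myimage.png\t"
          "method:GET\tstatus:200\tclient:12.34.567.89\t"
          "agent:Mozilla/5.0\ttx:340937\trx:595")

# Split into fields and strip the "name:" prefixes where present
fields = {}
for part in record.split('\t'):
    name, sep, value = part.partition(':')
    if sep and not name[0].isdigit():
        fields[name] = value
    else:
        # The first field is a bare timestamp with no "name:" prefix
        fields['timestamp'] = part

print(fields['method'])  # GET
print(fields['tx'])      # 340937
```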

Retrieving Logs: Python Program Example

import requests

# Enter login and password for authentication
headers = {'X-Auth-User': 'uSeR', 'X-Auth-Key': 'Pa$$w0rd'}
# Send the authentication request
r = requests.get('https://auth.selcdn.ru/', headers=headers)
assert r.status_code == 204
 
# Take X-Auth-Token and X-Storage-Url headers from the response and assign their values as variables
auth_token = r.headers.get('X-Auth-Token')
storage_url = r.headers.get('X-Storage-Url')
 
# Add two more headers for retrieving logs
log_headers = {'X-Auth-Token': auth_token, 'X-Start-Time': '2017-05-29 00:00:00', 'X-End-Time': '2017-06-21 00:00:00'}
 
# Request logs
r = requests.post(storage_url+'logs', headers=log_headers)
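The timestamps in X-Start-Time and X-End-Time follow the format YYYY-MM-DD HH:MM:SS. As a sketch, a small helper (hypothetical, not part of the API) can build the log-request headers from datetime objects so the formatting is handled in one place:

```python
from datetime import datetime

def log_headers(auth_token, start, end):
    """Build headers for a log request (helper for illustration only)."""
    fmt = '%Y-%m-%d %H:%M:%S'
    return {
        'X-Auth-Token': auth_token,
        'X-Start-Time': start.strftime(fmt),
        'X-End-Time': end.strftime(fmt),
    }

headers = log_headers('token', datetime(2017, 5, 29), datetime(2017, 6, 21))
print(headers['X-Start-Time'])  # 2017-05-29 00:00:00
```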

Sample Program for Analyzing Logs

Let's imagine the following: we downloaded a log file and need to find information about all of the GET requests sent to the "images" container during the reporting period.

The most basic solution in this case would be to use an in-file search to find the necessary lines. This isn't very efficient, though; log files may contain hundreds or even thousands of lines. A better solution would be to automate the log search.

Below is an example of a simple program, written in Python, which can perform the following actions:

  • check log files for lines meeting custom-defined criteria
  • retrieve the following information from log records: when and what files were downloaded and by whom
  • write this information in a standard printout
  • save this information in a specially created file

import os
import re

# Compile a regular expression for searching the log line by line
line_regex = re.compile(r'(images/.+?)\smethod:GET')

# Indicate the file where lines meeting the criteria will be copied
output_filename = os.path.normpath("output/parsed_lines.log")
# Create the output directory if needed; the file is rewritten each time the program is launched
os.makedirs(os.path.dirname(output_filename), exist_ok=True)
with open(output_filename, "w") as out_file:
    out_file.write("")

# Open the file for appending the printout
with open(output_filename, "a") as out_file:
    # Open the log file for reading
    with open("access_1454392800-1462179600.log", "r") as in_file:
        # Search the log line by line
        for line in in_file:
            # If a line meets the criteria, extract the necessary information
            # and write it to standard output and to our file
            if line_regex.search(line):
                entry = line.split('\t')
                print(entry[0], entry[1], entry[2], entry[3], entry[4], entry[6], entry[7])
                out_file.write(", ".join(entry[:5] + entry[6:]))
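To check what the regular expression actually selects, it can be tried against individual lines first. The sample lines below assume the tab-separated record format shown earlier; only requests to the "images" container should match:

```python
import re

# Same pattern as in the program above: a path inside "images" followed by method:GET
line_regex = re.compile(r'(images/.+?)\smethod:GET')

matching = "2016-04-20_11:24:11\tpath:0000.selcdn.com/images/myimage.png\tmethod:GET\tstatus:200"
other = "2016-04-20_11:25:03\tpath:0000.selcdn.com/docs/readme.txt\tmethod:GET\tstatus:200"

print(bool(line_regex.search(matching)))  # True
print(bool(line_regex.search(other)))     # False
```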