# Ruby SDK User Guide

This guide will show you how to access your project using the Ruby SDK. You can get the source code of the Ruby SDK on GitHub.

**Latest version**: 1.2.0

**Last updated**: 2020-08-28

# I. Integrate SDK

# 1.1 Install SDK

Please use the gem command to get the SDK package.

# install SDK
gem install thinkingdata-ruby

# 1.2 Create an SDK Instance

First, require thinkingdata-ruby at the beginning of your code file:

require 'thinkingdata-ruby'

To upload data using the SDK, you first need to create a TDAnalytics::Tracker object. TDAnalytics::Tracker is the core class for data reporting and can be used to report event data and user feature data. Creating a Tracker object requires passing in a consumer object; the consumer determines how the data is reported (stored in a local log file or uploaded to the server).

# Create a Tracker object
ta = TDAnalytics::Tracker.new(consumer)

# report data
ta.track(event_name: 'your_event', distinct_id: 'distinct_id_of_user')

TDAnalytics provides three consumer implementations:

**(1) LoggerConsumer**: writes data to local files in real time; files are split by day/hour and need to be uploaded with LogBus. This is the recommended approach for production environments.

# By default, files are written to the current directory and split by day, e.g.: tda.log.2019-11-15
consumer = TDAnalytics::LoggerConsumer.new

# You can also modify the configuration. The following creates a LoggerConsumer that writes data to: /path/to/log/demolog.2019-11-15-18 (18 is the hour)
consumer = TDAnalytics::LoggerConsumer.new('/path/to/log', 'hourly', prefix: 'demolog')

**(2) DebugConsumer**: transmits data to the TA server one record at a time in real time, and returns a detailed error message when the data format is wrong. It is recommended to use DebugConsumer first to verify the data format. Initialize it with your project APP ID and the receiver URL. Do not use it in production environments.

# create DebugConsumer
consumer = TDAnalytics::DebugConsumer.new(SERVER_URL, YOUR_APPID)

If you do not want the data to be stored, but only want to verify the data format, you can initialize the code as follows:

# create DebugConsumer
consumer = TDAnalytics::DebugConsumer.new(SERVER_URL, YOUR_APPID, false)

SERVER_URL is the URL for data transfer, and YOUR_APPID is the APP ID of your project.

If you are using the cloud service (SaaS), enter the following URL:

http://receiver.ta.thinkingdata.cn/

If you are using a private deployment, enter the following URL:

http://Data Acquisition Address

**(3) BatchConsumer**: transmits data to the TA server in batches in real time and does not need to be paired with a transfer tool. Under poor network conditions data may be lost, so it is not recommended for extensive use in production environments. Initialize it with your project APP ID and the receiver URL.

BatchConsumer will first store the data in a buffer; when the number of records exceeds the configured maximum buffer size (max_buffer_length, 20 by default), reporting is triggered. You can also configure the buffer size by passing an integer parameter when initializing the SDK:

 # BatchConsumer: data is stored in the buffer first and reported once the specified number of records is reached (20 by default)
 consumer = TDAnalytics::BatchConsumer.new(SERVER_URL, YOUR_APPID)

 # Create a BatchConsumer with a buffer size of 3
 #consumer = TDAnalytics::BatchConsumer.new(SERVER_URL, YOUR_APPID, 3)

 # Set whether to compress data. The default (true) uses gzip compression; on an intranet compression can be turned off
 #consumer = TDAnalytics::BatchConsumer.new(SERVER_URL, YOUR_APPID)
 #consumer._set_compress(false)

SERVER_URL is the URL for data transfer, and YOUR_APPID is the APP ID of your project.

If you are using the cloud service (SaaS), enter the following URL:

http://receiver.ta.thinkingdata.cn/

If you are using a private deployment, enter the following URL:

http://Data Acquisition Address/

Note: For version 1.0.0, enter the following URL:

http://receiver.ta.thinkingdata.cn/logagent
http://Data Acquisition Address/logagent

You can also pass in your own Consumer implementation by implementing the following interface (a minimal sketch follows the list):

  • add(message): (required) accepts a data object of type Hash
  • flush: (optional) sends the buffered data to the specified destination
  • close: (optional) can be called by the user when the program exits to ensure a safe shutdown
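
Below is a minimal sketch of such a custom consumer; the class name MyConsumer and its behavior of simply printing records are illustrative and not part of the SDK.

# Custom consumer sketch: buffers Hash records and prints them on flush
class MyConsumer
  def initialize
    @buffer = []
  end

  # Required: receive one piece of data as a Hash
  def add(message)
    @buffer << message
  end

  # Optional: send (here: print) the buffered data
  def flush
    @buffer.each { |msg| puts msg.inspect }
    @buffer.clear
  end

  # Optional: called before the program exits to avoid losing buffered data
  def close
    flush
  end
end

ta = TDAnalytics::Tracker.new(MyConsumer.new)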

# II. Report Data

After the SDK is initialized, the following interfaces can be used to report data.

# 2.1 Send Events

You can call track to upload events. It is recommended that you set the event properties and the conditions for sending them according to the tracking plan you have designed in advance. An example of uploading an event is shown below:

# Define event data
event = {
  # Event name (required)
  event_name: 'test_event',
  # Account ID (optional)
  account_id: 'ruby_test_aid',
  # Visitor ID (optional); account ID and visitor ID cannot both be empty
  distinct_id: 'ruby_distinct_id',
  # Event time (optional) If not filled in, the time at which the interface was called will be used as the event time
  time: Time.now,
  # Event IP (optional) When an IP address is passed in, the background can resolve the location
  ip: '202.38.64.1',
  # Event attributes (optional)
  properties: {
    prop_date: Time.now,
    prop_double: 134.1,
    prop_string: 'hello world',
    prop_bool: true,
  },
  # Skip local format checksum (optional)
  # skip_local_check: true,
}

# Upload Event
ta.track(event)

Parameter description:

  • The event name must start with a letter and can contain numbers, letters, and the underscore "_". It can be up to 50 characters long and is not case-sensitive.
  • The event properties are of type Hash, where each element represents one property.
  • The key of an event property is the property name and is of type String. It must start with a letter, can contain numbers, letters, and the underscore "_", is at most 50 characters long, and is not case-sensitive.
  • The value of an event property is the property value; String, numeric types, bool, Time, and arrays are supported.

The SDK verifies the data format locally. If you want to skip the local verification, you can pass the skip_local_check parameter when calling the track interface.
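
For example (a minimal sketch reusing the event hash defined above), local validation can be skipped for a single event like this:

# Skip the local format check for this event only
event[:skip_local_check] = true
ta.track(event)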

# 2.2 Set Public Event Properties

Public event properties are properties that every event will carry. Ordinary public properties have fixed values. You can also set dynamic public properties, whose values are evaluated at the moment an event is reported and then added to that event. If the same property is set in both, the dynamic public property overrides the ordinary public event property.

# Define common attributes
super_properties = {
  super_string: 'super_string',
  super_int: 1,
  super_bool: false,
  super_date: Time.rfc2822("Thu, 26 Oct 2019 02:26:12 +0545")
}

# Set common event attributes, which are added to each event
ta.set_super_properties(super_properties)

# Clear public event attributes
ta.clear_super_properties
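
While the public properties are set (i.e. before clear_super_properties is called), every event reported via track automatically carries them in addition to its own properties. A small illustrative sketch reusing the names above:

# The reported event contains super_string, super_int, super_bool and super_date
# in addition to prop_string
ta.track(event_name: 'test_event',
         distinct_id: 'ruby_distinct_id',
         properties: { prop_string: 'hello world' })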

# 2.3 User Features

# 2.3.1 user_set

For general user features, you can call user_set to set them. Attributes uploaded using this interface will overwrite the original attribute value (attributes that have not been passed in will not be modified, the same below). If the user feature does not exist before, the new user feature will be created:

# Define user attribute data
user_data = {
	# Account ID (optional)
	account_id: 'ruby_test_aid',
	# Visitor ID (optional), account ID and visitor ID can not be empty
	distinct_id: 'ruby_distinct_id',
	# User attribute
  properties: {
    prop_date: Time.now,
    prop_double: 134.12,
    prop_string: 'hello',
    prop_int: 666,
	},
}

# Set user attribute
ta.user_set(user_data)

# 2.3.2 user_set_once

If the user feature you want to upload only needs to be set once, you can call user_set_once to set it. When the attribute already has a value before, this information will be ignored to ensure that the incoming value is the first received value:

# Set the user attribute. If the user's attribute has a value, the newly set attribute will be ignored
ta.user_set_once(user_data)

# 2.3.3 user_add

When you want to upload a numeric attribute, you can call user_add to accumulate the attribute. If the attribute has not been set, a value of 0 will be assigned before calculation:

# Accumulate attributes of numeric type
ta.user_add(distinct_id: 'ruby_distinct_id', properties: {prop_int: 10, prop_double: 15.88})
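
Repeated calls accumulate on top of the existing values. A small illustrative sketch continuing the call above:

# prop_int goes from 10 to 18, prop_double from 15.88 to 26.03
ta.user_add(distinct_id: 'ruby_distinct_id', properties: {prop_int: 8, prop_double: 10.15})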

# 2.3.4 user_unset

When you need to empty a user's user feature value, you can call user_unset to empty:

# Clear a user attribute of a user
ta.user_unset(distinct_id: 'ruby_distinct_id', property: :prop_string)

# Clear a set of user attributes for a user
ta.user_unset(distinct_id: 'ruby_distinct_id', property: [:prop_a, :prop_b, :prop_c])

# 2.3.5 user_append

When you want to append values to an array-type user feature, you can call user_append to append to the specified property. If the property has not yet been created in the cluster, user_append will create it.

 # Append one or more array-type properties to the user
 user_data_arr = {
     # Account ID (optional)
     account_id: ACCOUNT_ID,
     # Visitor ID (optional), account ID and visitor ID cannot both be empty
     distinct_id: DISTINCT_ID,
     # User attribute: upload as key-array pairs; elements in the array will eventually be converted to strings
     properties: {
         array: ["11", "22"],
     },
 }
 ta.user_append(user_data_arr)

# 2.3.6 user_del

If you want to delete a user, you can call user_del to delete the user. After that, you will no longer be able to query this user's features, but the events generated by the user can still be queried:

# delete user
ta.user_del(
	# account ID (Optional)
	account_id: 'ruby_test_aid',
	# Visitor ID (optional), account ID and visitor ID cannot both be empty
	distinct_id: 'ruby_distinct_id',
);

# III. Advanced Functions

Starting with v1.2.0, the SDK supports reporting two special types of events: updatable events and rewritable events. They must be used together with TA system version 2.8 or later. Since special events are only applicable in certain specific scenarios, please report data with special events under the guidance of ThinkingData's customer success team and analysts.

# 3.1 Updatable Events

You can implement the need to modify event data in a specific scenario through updatable events. Updatable events need to specify an ID that identifies the event and pass it in when you create an updatable event object. The TA background will determine the data that needs to be updated based on the event name and event ID.

# Example: report an updatable event, assuming the event name is event_update
event_name = 'event_update'
event_id = '123'
account_id = '123'
distinct_id = '65478cc0-275a-4aeb-9e6b-861155b5aca7'
prop = {
  price: 100,
  status: 3,
}

# After reporting, the event attributes are status 3 and price 100
event_update = {
  # Event name (required) string
  event_name: event_name,
  # Event ID (required) string
  event_id: event_id,
  # Account ID (optional) string
  account_id: account_id,
  # Visitor ID (optional) string; account ID and visitor ID cannot both be empty
  distinct_id: distinct_id,
  # Event time (optional); if not filled in, the time at which the interface is called will be used as the event time
  time: Time.now,
  # Event IP (optional); when an IP address is passed in, the background can resolve the location
  ip: '202.38.64.1',
  # Event attributes (optional)
  properties: prop,
}
ta.track_update(event_update)

# For the same event name + event ID, the status attribute is updated to 5 after reporting; price remains unchanged
new_properties = {
  status: 5,
}
event_update = {
  event_name: event_name,
  event_id: event_id,
  account_id: account_id,
  distinct_id: distinct_id,
  time: Time.now,
  ip: '202.38.64.1',
  properties: new_properties,
}
ta.track_update(event_update)

# 3.2 Rewritable Events

Rewritable events are similar to updatable events, except that rewritable events completely overwrite the historical data with the latest data, which is equivalent to deleting the previous data and storing the latest data. The TA background will determine which data needs to be updated based on the event name and event ID.

# Example: report a rewritable event, assuming the event name is event_overwrite
event_name = 'event_overwrite'
event_id = '123'
account_id = '123'
distinct_id = '65478cc0-275a-4aeb-9e6b-861155b5aca7'
prop = {
  price: 100,
  status: 3,
}

# After reporting, the event attributes are status 3 and price 100
event_overwrite = {
  # Event name (required) string
  event_name: event_name,
  # Event ID (required) string
  event_id: event_id,
  # Account ID (optional) string
  account_id: account_id,
  # Visitor ID (optional) string; account ID and visitor ID cannot both be empty
  distinct_id: distinct_id,
  # Event time (optional); if not filled in, the time at which the interface is called will be used as the event time
  time: Time.now,
  # Event IP (optional); when an IP address is passed in, the background can resolve the location
  ip: '202.38.64.1',
  # Event attributes (optional)
  properties: prop,
}
ta.track_overwrite(event_overwrite)

# For the same event name + event ID, the status attribute is updated to 5 after reporting and the price attribute is deleted
new_prop = {
  status: 5,
}
event_overwrite = {
  event_name: event_name,
  event_id: event_id,
  account_id: account_id,
  distinct_id: distinct_id,
  time: Time.now,
  ip: '202.38.64.1',
  properties: new_prop,
}
ta.track_overwrite(event_overwrite)

# IV. Other Operations

# 4.1 Data IO Now

This operation is related to the specific Consumer implementation. When data is received, the Consumer may first store it in a buffer and only trigger the actual data I/O under certain conditions to improve overall performance. If you need to submit data immediately, you can call the flush interface:

# Submit data immediately to the appropriate receiver
ta.flush

# 4.2 Shut Down SDK

Please call this interface before exiting the program to avoid data loss in the cache:

# Close and exit SDK
ta.close

# 4.3 Handle Exception

By default, errors are ignored, except when the initialization parameters are invalid. If you want to handle errors raised by interface calls yourself, you can pass in a custom error handler:

# (Optional) Define an error handler that will be called when an Error occurs
class MyErrorHandler < TDAnalytics::ErrorHandler
  def handle(error)
      puts error
      raise error
  end
end
my_error_handler = MyErrorHandler.new

# Create a TA instance. The first parameter is the Consumer, and the optional second parameter is the error handler, which will be called when an error occurs
ta = TDAnalytics::Tracker.new(consumer, my_error_handler, uuid: true)

If uuid is true, each piece of data will carry a random UUID, reported as the value of the #uuid attribute. This value is not stored in the database and is only used for duplicate-data detection in the background.

# ChangeLog

# v1.2.0 (2020/08/28)

  • Added track_update interface to support updatable events
  • Added track_overwrite interface to support rewritable events

# v1.1.0 (2020/02/11)

  • Data types now support arrays
  • Added the user_append interface to support appending to a user's array-type attributes
  • BatchConsumer performance optimization: support for choosing whether to compress data; removed Base64 encoding
  • DebugConsumer optimization: more complete and accurate data validation on the server side

# v1.0.0 (2019-11-20)

  • Support three modes of reporting: DebugConsumer, BatchConsumer, LoggerConsumer.
  • Support event reporting and user feature reporting.
  • Supports public event properties.