
# Node.js SDK User Guide

This guide shows you how to report data from your project using the Node.js SDK. The source code of the Node.js SDK is available on GitHub.

**Latest version**: 1.2.2

**Updated**: 2021-10-14

# I. Integrate SDK

# 1.1 Install SDK

Please use npm to get the Node.js SDK:

# install SDK
npm install thinkingdata-node --save

# update SDK
npm i thinkingdata-node@{version}

# 1.2 Create an SDK Instance

First, require thinkingdata-node at the beginning of your code file:

var ThinkingAnalytics = require("thinkingdata-node");

To upload data with the SDK, you first need to create an SDK instance. Three initialization modes are available:

// Debug Mode: sends data one record at a time and returns detailed error messages
var ta = ThinkingAnalytics.initWithDebugMode("APP-ID", "https://SERVER_URL");

// If you only want to validate the data format without actually writing it into the database, configure:
// config = {
//     dryRun: true
// };
// var ta = ThinkingAnalytics.initWithDebugMode('APP-ID', 'https://SERVER_URL', config);

// Batch Mode: sends data to the receiver in batches
var ta = ThinkingAnalytics.initWithBatchMode("APP-ID", "https://SERVER_URL");

// You can also pass a configuration when initializing:
// config = {
//     batchSize: 2, // number of buffered records that triggers a flush
//     compress: false // default true enables gzip compression; false suits intranet deployments
// };
// var ta = ThinkingAnalytics.initWithBatchMode('APP-ID', 'https://SERVER_URL', config);

// Logging Mode: writes data to local log files; must be used together with LogBus
var ta = ThinkingAnalytics.initWithLoggingMode("/PATH/TO/DATA");

The three modes are described below:

**(1) Debug Mode**: transmits data to the TA server in real time, one record at a time, and returns detailed error messages when the data format is wrong. It is recommended to verify the data format with DebugConsumer first. Initialize it with the project APP ID and the receiver address.

**(2) Batch Mode**: transmits data to the TA server in batches in real time, without requiring a separate transfer tool. Data may be lost under poor network conditions, so extensive use in production environments is not recommended. Initialize it with the project APP ID and the receiver address.

Batch Mode first stores the data in a buffer; when the number of buffered records exceeds the configured value (batchSize, default 20), a report is triggered. You can also pass a configuration when initializing the SDK:

// Initialize SDK, specify receiver address, APP ID, buffer size
var ta = ThinkingAnalytics.initWithBatchMode("APP-ID", "https://SERVER_URL", {
  batchSize: 10, // triggers a report when 10 records are buffered
  enableLog: true // when enabled, detailed sending logs are printed
});
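The buffering behavior above can be sketched conceptually. This is an illustration of the batching idea only, not the SDK's actual implementation; `makeBatchBuffer` is a hypothetical helper:

```javascript
// Conceptual sketch of Batch Mode buffering: records accumulate in a buffer,
// and once batchSize is reached everything buffered is sent in one report.
function makeBatchBuffer(batchSize, send) {
  var buffer = [];
  return {
    add: function (record) {
      buffer.push(record);
      if (buffer.length >= batchSize) {
        send(buffer.splice(0, buffer.length)); // flush the whole buffer
      }
    },
    flush: function () {
      if (buffer.length > 0) send(buffer.splice(0, buffer.length));
    }
  };
}

var sent = [];
var buf = makeBatchBuffer(3, function (batch) { sent.push(batch); });
buf.add({ event: "a" });
buf.add({ event: "b" });
buf.add({ event: "c" }); // the third record triggers one report of three records
```

This is also why a manual `flush` interface exists (see section 4.1): records below the threshold stay in the buffer until flushed.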

**(3) Logging Mode**: writes data to local files in real time. The files are rotated by day or by hour and must be uploaded with LogBus. Recommended for production environments.

Logging Mode uses log4js to save data to local log files in real time. You then need to use LogBus to import the log files into the TA database:

// Log Data Storage Catalog
var logPath = "/home/app/logs/";

// Initialise SDK instance
var ta = ThinkingAnalytics.initWithLoggingMode(logPath);

By default the log file is rotated daily. You can pass a configuration object when initializing the SDK to change the rotation to hourly:

// Initialize SDK and split log files by hour.
var ta = ThinkingAnalytics.initWithLoggingMode(logPath, { rotateHourly: true });

TIP

If you use pm2 to manage your Node.js application, set the startup parameters as follows:

  1. Install pm2-intercom:
pm2 install pm2-intercom
  2. Pass the pm2 parameter when initializing the SDK:
// Initialise SDK, split log files by hour, specify PM2 mode
var ta = ThinkingAnalytics.initWithLoggingMode(logPath, {
  rotateHourly: true,
  pm2: true
});

WARNING

If you specify an instance_var in the pm2 configuration, you need to pass it in the initialization configuration as well. Assuming instance_var is set to INSTANCE_ID in the pm2 configuration, pass the following parameters when initializing the SDK:

// Initialize SDK, split log files by hour, specify PM2 mode, specify pm2InstanceVar
var ta = ThinkingAnalytics.initWithLoggingMode(logPath, {
  rotateHourly: true,
  pm2: true,
  pm2InstanceVar: "INSTANCE_ID"
});

# II. Report Data

After the SDK is initialized, you can use TA's interfaces to report data.

# 2.1 Send Events

You can call track to upload events. It is recommended that you set event properties and triggering conditions according to your predefined tracking plan. An example of uploading an event:

// Define event data
var event = {
  // Account ID (optional)
  accountId: "node_test",
  // Visitor ID (optional); account ID and visitor ID cannot both be empty
  distinctId: "node_distinct_id",
  // Event name (required)
  event: "test_event",
  // Event time (optional) If not filled in, the time at which the interface was called will be used as the event time
  time: new Date(),
  // Event IP (optional) When an IP address is passed in, the background can resolve the location
  ip: "202.38.64.1",
  // Event attributes (optional)
  properties: {
    prop_date: new Date(),
    prop_double: 134.1,
    prop_string: "hello world",
    prop_int: 67
  },
  //Callback in case of error (optional)
  callback(e) {
    if (e) {
      console.log(e);
    }
  }
};

// Upload Event
ta.track(event);

Parameter description:

  • The event name must start with a letter and can contain digits, letters, and the underscore "_". It is at most 50 characters long and is case-insensitive.
  • Event properties are of map type, where each element represents one property.
  • The key of an event property is the property name, of string type. It must start with a letter, may contain digits, letters, and the underscore "_", is at most 50 characters long, and is case-insensitive.
  • The value of an event property is the property value; string, numeric, bool, Date, and array types are supported.
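The naming rules above can be expressed as a simple check. `isValidName` is a hypothetical helper shown only for illustration; the SDK performs its own validation:

```javascript
// Valid names: start with a letter, followed by letters/digits/underscores,
// 50 characters at most (one leading letter plus up to 49 more characters).
var NAME_RULE = /^[a-zA-Z][a-zA-Z0-9_]{0,49}$/;

function isValidName(name) {
  return typeof name === "string" && NAME_RULE.test(name);
}

console.log(isValidName("test_event")); // true
console.log(isValidName("2nd_event"));  // false: starts with a digit
console.log(isValidName("prop-name"));  // false: contains "-"
```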

The SDK checks the data format locally. If you want to skip the local check, pass the skipLocalCheck parameter when calling the track interface:

// Skip local data checksum
ta.track(event, true);

# 2.2 Set Public Event Properties

Public event properties are properties included with every event. An ordinary public property has a fixed value. You can also set dynamic public properties, whose values are computed when each event is reported and then added to the event. If the same property is set in both, the dynamic public property overrides the ordinary public event property.

// Set Dynamic Common Properties
ta.setDynamicSuperProperties(() => {
  var date = new Date();
  date.setFullYear(2018);
  return {
    super_date: date,
    super_int: 5
  };
});

// Set common event properties
ta.setSuperProperties({
  super_int: 8, // will not appear in the final report; overridden by the dynamic public property
  super_debug_string: "hahahaha"
});

// Clear public event attributes
ta.clearSuperProperties();
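The override order described above can be illustrated with a plain object merge. This is a conceptual sketch, not the SDK's internal code; it assumes static public properties are applied first and dynamic public properties are merged on top, as the text above states:

```javascript
// Conceptual merge order: static super properties, then dynamic on top.
var superProperties = { super_int: 8, super_debug_string: "hahahaha" };
var dynamicSuperProperties = function () { return { super_int: 5 }; };

// Properties as they would end up on a reported event:
var finalProperties = Object.assign({}, superProperties, dynamicSuperProperties());

console.log(finalProperties.super_int);          // 5: the dynamic value wins
console.log(finalProperties.super_debug_string); // "hahahaha": kept from the static set
```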

# III. Set User Attributes

# 3.1 UserSet

For general user properties, you can call userSet to set them. Properties uploaded through this interface overwrite the existing values; if a property does not exist yet, it is created:

// User Attribute Data
var userData = {
  // Account ID (optional)
  accountId: "node_test",
  // Visitor ID (optional); account ID and visitor ID cannot both be empty
  distinctId: "node_distinct_id",
  // user attribute
  properties: {
    prop_date: new Date(),
    prop_double: 134.12,
    prop_string: "hello",
    prop_int: 666
  },
  // Callback in case of error (optional)
  callback(e) {
    if (e) {
      console.log(e);
    }
  }
};

// Set User Properties
ta.userSet(userData);

# 3.2 UserSetOnce

If a user property only needs to be set once, call userSetOnce to set it. If the property already has a value, the upload is ignored:

ta.userSetOnce(userData);

# 3.3 UserAdd

When you want to accumulate a numeric property, call userAdd. If the property has not been set yet, it is treated as 0 before the calculation:

// Accumulate User Properties
ta.userAdd({
  accountId: "node_test",
  properties: {
    Amount: 222
  }
});

# 3.4 UserDel

If you want to delete a user, call userDel. Afterwards you can no longer query that user's properties, but the events the user generated can still be queried:

// delete user
ta.userDel({
  // account ID (optional)
  accountId: "node_test",
  // Visitor ID (optional); account ID and visitor ID cannot both be empty
  distinctId: "node_distinct_id"
});

# 3.5 UserAppend

When you want to append values to an array-type user property, call userAppend to append to the specified property. If the property has not been created in the cluster, userAppend creates it:

// Attribute key - array is appended to the user list type, array is a string array
ta.userAppend({
  accountId: "node_test",
  properties: {
    prop_array: ["str3", "str4"]
  }
});

# 3.6 UserUnset

When you want to clear a user's property value, call userUnset to clear the specified property. If the property has not been created in the cluster, userUnset will not create it:

//Delete a user property
ta.userUnset({
  accountId: "node_test",
  property: "prop_double"
});

# IV. Other Operations

# 4.1 Data IO Now

This operation depends on the specific Consumer implementation: when receiving data, a Consumer may first store it in a buffer and trigger the actual data I/O later to improve overall performance. If you need to submit data immediately, call the flush interface:

// Submit data immediately to the appropriate receiver
ta.flush();

# 4.2 Shut Down SDK

Call this interface before the program exits to avoid losing cached data:

// Close and exit SDK
ta.close();
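To make sure the cache is also flushed when the process is terminated externally, you can hook process signals. This is a sketch only: `ta` is stubbed here so the snippet is self-contained; in real code use the instance created during initialization:

```javascript
// Stub standing in for a real SDK instance (in real code, create it with
// initWithBatchMode / initWithLoggingMode as shown earlier).
var ta = {
  flush: function () { /* push buffered data to the receiver */ },
  close: function () { /* flush remaining data and release resources */ }
};

// Flush and close before the process exits so buffered data is not lost.
["SIGINT", "SIGTERM"].forEach(function (sig) {
  process.on(sig, function () {
    ta.flush();
    ta.close();
    process.exit(0);
  });
});
```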

# 4.3 Open Log

Set the environment variable NODE_DEBUG=tda to enable logging.

# ChangeLog

v1.2.2 (2021-10-14)

  • Support uploading preset properties
  • Remove the field name length limit

v1.2.1 (2021-08-31)

  • LoggingConsumer optimization: support for specifying file prefixes

v1.2.0 (2020-10-16)

  • Support first-time event reporting
  • Support updatable and overwritable event reporting

v1.1.2 (2020-09-08)

  • Fix an issue where passing only distinctId did not take effect

v1.1.1 (2020-04-14)

  • Fix a bug where the reporting address did not support a port number

v1.1.0 (2020-02-12)

  • Support reporting array types
  • Added user_append interface to support attribute appending of user's array type
  • DebugConsumer Optimization: More complete and accurate validation of data at the server level
  • BatchConsumer performance optimization: support for configuring compression mode; remove Base64 encoding

v1.0.1 (2019-11-07)

  • Support three modes of reporting: Debug, Batch, Logging.
  • Support event reporting and user feature reporting.
  • Support public event properties and dynamic public event properties.