
# Java SDK User Guide

TIP

Before integrating, please read Preparation Before Integration first.

You can get the Java SDK source code on GitHub.

**Latest version**: 1.9.1

**Updated**: 2021-12-20

# I. Integrate and Initialize SDK

  1. Integrate the SDK with Maven by placing the following dependency in your pom.xml file (recommended):
<dependencies>
    <!-- other dependencies... -->
    <dependency>
        <groupId>cn.thinkingdata</groupId>
        <artifactId>thinkingdatasdk</artifactId>
        <version>1.9.1</version>
    </dependency>
</dependencies>
  2. Alternatively, import the project using the JAR package.

Java SDK JAR download address

  3. Initialize the SDK.
//The SDK can be initialized in two ways; the Consumer can be a LoggerConsumer, BatchConsumer, or DebugConsumer
//default
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(consumer);
//with UUID de-duplication enabled
boolean enableUuid = true;
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(consumer, enableUuid);

You can create the SDK instance with one of three Consumer types (it is recommended to initialize only once):

**(1) LoggerConsumer: **writes data to local files in batches in real time. The files are split by day and must be uploaded with LogBus.

//Using LoggerConsumer; files are split by day by default
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(new ThinkingDataAnalytics.LoggerConsumer(LOG_DIRECTORY));

If you want to split the file by the hour, you can initialize the code as follows:

//Configuration class of LoggerConsumer
ThinkingDataAnalytics.LoggerConsumer.Config config = new ThinkingDataAnalytics.LoggerConsumer.Config(LOG_DIRECTORY);
//Split by hour; the default is RotateMode.DAILY (split by day)
config.setRotateMode(ThinkingDataAnalytics.LoggerConsumer.RotateMode.HOURLY);
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(new ThinkingDataAnalytics.LoggerConsumer(config));

If you want to split by size, you can initialize the code as follows:

//Configuration class of LoggerConsumer
ThinkingDataAnalytics.LoggerConsumer.Config config = new ThinkingDataAnalytics.LoggerConsumer.Config(LOG_DIRECTORY);
//On top of daily splitting, also split the file when it reaches the given size, in MB; e.g. split at 2 GB
config.setFileSize(2*1024);
//Set the prefix of generated file names
config.setFilenamePrefix("prefix");
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(new ThinkingDataAnalytics.LoggerConsumer(config));

LOG_DIRECTORY is the local directory the data is written to. Set the monitored folder of LogBus to this same path, and LogBus will watch it and upload the data.

**(2) BatchConsumer: **transmits data to the TA server in batches in real time and does not need a transfer tool. If transmission fails due to network problems, it retries 3 times; if it still fails, the data is kept in a buffer. The buffer size is configurable and defaults to 50, so the buffer holds at most 50 × 20 records (20 is the per-upload batch size, also configurable). During prolonged network outages there is a risk of data loss.

//Using BatchConsumer with default settings
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(new ThinkingDataAnalytics.BatchConsumer(SERVER_URI, APP_ID));

If you want to set the data compression method, you can initialize the code as follows:

//BatchConsumer
ThinkingDataAnalytics.BatchConsumer.Config batchConfig = new ThinkingDataAnalytics.BatchConsumer.Config();
//Configurable compression methods: gzip, lzo, lz4, or none. The default is gzip; none may be used on an intranet
batchConfig.setCompress("gzip");
//Configurable connection timeout, unit: ms
batchConfig.setTimeout(10000);//10s
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(new ThinkingDataAnalytics.BatchConsumer(SERVER_URI, APP_ID, batchConfig));

If you want to set the flush batch size, you can initialize the code as follows:

//BatchConsumer
ThinkingDataAnalytics.BatchConsumer.Config batchConfig = new ThinkingDataAnalytics.BatchConsumer.Config();
//The number of records sent to the TA system per flush defaults to 20 and can be configured as follows
batchConfig.setBatchSize(30);//the batch size defaults to 20; the upper limit is 7000
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(new ThinkingDataAnalytics.BatchConsumer(SERVER_URI, APP_ID, batchConfig));

SERVER_URI is the URI the data is sent to, and APP_ID is the APP ID of your project.

If you are using Cloud as a Service, enter the following URL:

http://receiver.ta.thinkingdata.cn

If you are using the version of private deployment, enter the following URL:

http://Data acquisition address

Note: for SDK versions prior to 1.4.0, use the following URLs instead:

http://receiver.ta.thinkingdata.cn/logagent
http://Data acquisition address/logagent

**(3) DebugConsumer: **transmits data to the TA server one record at a time in real time and does not need a transfer tool. If a record contains an error, the whole record is rejected and a detailed error description is returned. **Not recommended for use in production environments.**

ThinkingDataAnalytics ta = new ThinkingDataAnalytics(new ThinkingDataAnalytics.DebugConsumer(SERVER_URI, APP_ID));

If you do not want the data to be stored, but only want to verify the data format, you can initialize the code as follows:

//The third parameter defaults to true (store data); pass false to only validate the data format without storing it
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(new ThinkingDataAnalytics.DebugConsumer(SERVER_URI, APP_ID, false));

SERVER_URI is the URI the data is sent to, and APP_ID is the APP ID of your project.

If you are using Cloud as a Service, enter the following URL:

http://receiver.ta.thinkingdata.cn

If you are using the version of private deployment, enter the following URL:

http://Data acquisition address

# II. Send Data

After the SDK is initialized, you can call track to upload events. In general, you may need to upload anywhere from a dozen to hundreds of different events. If you are using the TA background for the first time, we recommend uploading a few key events first.

If you have doubts about what kind of events you need to send, you can check the Quick Use Guide for more information.

# 2.1 Send Events

You can call track to upload events. It is recommended that you set the event properties and the conditions for sending them according to the event design document you prepared earlier. Here, take user payment as an example:

//Initialise SDK
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(new ThinkingDataAnalytics.BatchConsumer(SERVER_URI, APP_ID));

//Set Visitor ID "ABCDEFG123456789"
String distinct_id = "ABCDEFG123456789";

//Set account ID "TA_10001"
String account_id = "TA_10001";

//Set Event Properties
Map<String,Object> properties = new HashMap<String,Object>();

// Set the time when the event occurred, if not, the default is the current time
properties.put("#time", new Date());

// Set the user's IP address; the TA system will resolve the user's geographic location from it. If not set, the location is not resolved
properties.put("#ip", "192.168.1.1");

properties.put("Product_Name", "Product A");
properties.put("Price", 30);
properties.put("OrderId", "order number abc_123");

// Upload the event with the user's visitor ID and account ID; be careful not to swap the two
try {
    ta.track(account_id, distinct_id, "payment", properties);
    ta.flush();
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

**Note: **To ensure that the visitor ID and account ID can be bound correctly, if your game uses both a visitor ID and an account ID, we strongly recommend uploading both IDs at the same time; otherwise accounts will not match, causing users to be counted twice. For the specific ID binding rules, please refer to the chapter User Identification Rules.

  • The event name is of type String, can only begin with a letter, may contain digits, letters, and the underscore "_", is at most 50 characters long, and is case-insensitive.
  • The event properties are a Map<String, Object> object, where each element represents one property.
  • The Key is the property name, of type String. It can only begin with a letter, may contain digits, letters, and the underscore "_", is at most 50 characters long, and is case-insensitive.
  • The Value is the property value, supporting String, Number, Boolean, Date, and List, as illustrated in the sketch below.
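
For illustration, here is a minimal sketch covering each supported value type; the property names are hypothetical:

//A sketch of each supported value type; property names are hypothetical
Map<String, Object> typedProperties = new HashMap<String, Object>();
typedProperties.put("channel", "AppStore");       // String
typedProperties.put("level", 10);                 // Number
typedProperties.put("is_vip", true);              // Boolean
typedProperties.put("register_time", new Date()); // Date
typedProperties.put("item_list", Arrays.asList("sword", "shield")); // List (supported since v1.5.0)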

# 2.2 Set Common Event Properties

For properties that need to appear in all events, you can call setSuperProperties to register them as public event properties; they will be added to every event uploaded with track.

Map<String, Object> superProperties = new HashMap<String,Object>();
//Set Common Properties: Server Name
superProperties.put("server_name", "S10001");
//Set Common Properties: Server Version
superProperties.put("server_version", "1.2.3");
//Set Common Event Properties
ta.setSuperProperties(superProperties);

Map<String,Object> properties = new HashMap<String,Object>();
//Set Event Properties
properties.put("Product_Name", "product A");
properties.put("Price", 30);
//Upload an event with the common attributes and the attributes of the event
try {
    ta.track(account_id, distinct_id, "payment", properties);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

/** Equivalent to adding these properties to every event
 *  properties.clear();
 *  properties.put("server_name", "S10001");
 *  properties.put("server_version", "1.2.3");
 *  properties.put("Product_Name", "Product A");
 *  properties.put("Price", 30);
 *  try {
 *      ta.track(account_id,distinct_id,"payment",properties);
 *  } catch (Exception e) {
 *      //exception handling
 *      System.out.println("except:"+e);
 *  }
 */
  • Public event properties are likewise a Map<String, Object> object, where each element represents one property.
  • The Key is the property name, of type String. It can only begin with a letter, may contain digits, letters, and the underscore "_", is at most 50 characters long, and is case-insensitive.
  • The Value is the property value, supporting String, Number, Boolean, Date, and List.

If you call setSuperProperties to set a public event property that was already set, the previous value is overwritten. If a public event property shares a Key with a property of an event uploaded via track, the event's property overrides the public event property:

Map<String, Object> superProperties = new HashMap<String,Object>();
superProperties.put("server_name", "S10001");
superProperties.put("server_version", "1.2.3");
//Set Common Event Properties
ta.setSuperProperties(superProperties);


superProperties.clear();
superProperties.put("server_name", "Q12345");
//Set the public event properties again, when "server_name" is overwritten and the value is "Q12345"
ta.setSuperProperties(superProperties);

Map<String,Object> properties = new HashMap<String,Object>();
properties.put("Product_Name", "product A");
//Set an event property whose Key duplicates a public event property
properties.put("server_version", "1.2.4");
//Upload the event; "server_version" takes the event's value "1.2.4", and "server_name" is "Q12345"

try {
    ta.track(account_id, distinct_id, "payment", properties);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

If you want to clear all public event properties, you can call clearSuperProperties, for example:
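
//Clear all previously set public event properties; subsequent track calls will no longer carry them
ta.clearSuperProperties();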

# III. User Attributes

The TA platform currently supports the following user property interfaces: user_set, user_setOnce, user_add, user_append, user_unset, and user_del.

# 3.1 User_set

For general user properties, you can call user_set to set them. Properties uploaded through this interface overwrite the existing values; if a property did not exist before, it is created, with the same type as the passed-in value. Take setting the user name as an example:

Map<String,Object> userSetProperties = new HashMap<String,Object>();
userSetProperties.put("user_name", "ABC");
userSetProperties.put("#time",new Date());

//Upload user properties
try {
    ta.user_set(account_id, distinct_id, userSetProperties);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

userSetProperties.clear();
userSetProperties.put("user_name", "abc");
userSetProperties.put("#time", new Date());

//Upload the user property again, and the value of "user_name" will be overwritten with "abc"
try {
    ta.user_set(account_id, distinct_id, userSetProperties);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

The user properties set by user_set are a Map<String, Object> object, where each element represents one property.

The Key is the property name, of type String. It can only begin with a letter, may contain digits, letters, and the underscore "_", is at most 50 characters long, and is case-insensitive.

The Value is the property value, supporting String, Number, Boolean, Date, and List.

# 3.2 User_setOnce

If the user property you want to upload only needs to be set once, you can call user_setOnce to set it: if the property already has a value, the new value is ignored. Take setting the user name as an example:

Map<String,Object> userSetOnceProperties = new HashMap<String,Object>();
userSetOnceProperties.put("user_name", "ABC");
userSetOnceProperties.put("#time",new Date());

//Upload user properties and create a new "user_name" with the value "ABC"
try {
    ta.user_setOnce(account_id, distinct_id, userSetOnceProperties);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

userSetOnceProperties.clear();
userSetOnceProperties.put("user_name","abc");
userSetOnceProperties.put("user_age",18);
userSetOnceProperties.put("#time",new Date());

//Upload the user properties again; the value of "user_name" will not be overwritten and remains "ABC"; the value of "user_age" is 18
try {
    ta.user_setOnce(account_id, distinct_id, userSetOnceProperties);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

The property types and restrictions of user_setOnce are the same as those of user_set.

# 3.3 User_add

When you want to upload a numeric property, you can call user_add to accumulate it. If the property has not been set yet, it is treated as 0 before the calculation. Negative values are allowed, which is equivalent to subtraction. Here is an example of accumulating the payment amount:

Map<String,Object> userAddProperties = new HashMap<String,Object>();
userAddProperties.put("total_revenue",30);
userAddProperties.put("#time",new Date());

//Upload user attributes with a value of 30 for "total_revenue"
try {
    ta.user_add(account_id, distinct_id, userAddProperties);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

userAddProperties.clear();
userAddProperties.put("total_revenue",60);
userAddProperties.put("#time",new Date());

//Upload user attributes again, when the value of "total_revenue" accumulates to 90
try {
    ta.user_add(account_id, distinct_id, userAddProperties);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}
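
As noted above, a negative value is equivalent to subtraction. A minimal sketch, reusing the map from the example above:

//Pass a negative value to decrement: "total_revenue" drops from 90 back to 60
userAddProperties.clear();
userAddProperties.put("total_revenue", -30);
try {
    ta.user_add(account_id, distinct_id, userAddProperties);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}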

The property types and restrictions of user_add are the same as those of user_set, but only numeric values are supported.

# 3.4 User_append

When you want to append values to a list-type user property, you can call user_append to append to the specified properties. If a property has not been created yet in the cluster, user_append creates it.

Map<String, Object> properties = new HashMap<>();
List<String> appendList1 = new ArrayList<>();
appendList1.add("12.2");
appendList1.add("str");
properties.put("arr1", appendList1);//append to the list-type property "arr1"
List<String> appendList2 = new ArrayList<>();
appendList2.add("2");
appendList2.add("true");
properties.put("arr2", appendList2);//append to the list-type property "arr2"
try {
    ta.user_append(account_id, distinct_id, properties);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

# 3.5 User_unset

When you want to clear a user's property values, you can call user_unset to reset the specified properties. If a property has not been created yet in the cluster, user_unset will not create it.

// Reset a single user property
try {
    ta.user_unset(account_id, distinct_id, "key1");
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

// Reset multiple user properties
try {
    ta.user_unset(account_id, distinct_id, "key1", "key2", "key3");
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

// Reset multiple user properties, passing a string array
String[] keys = {"key1", "key2"};
try {
    ta.user_unset(account_id, distinct_id, keys);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

# 3.6 User_del

If you want to delete a user, you can call user_del. Afterwards you can no longer query this user's properties, but the events generated by the user can still be queried. This operation may have irreversible consequences; please use it with caution.

try {
    ta.user_del(account_id, distinct_id);
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

# IV. Other Operations

# 4.1 Submit Data Immediately

ta.flush();

Submit data immediately to the appropriate receiver.

# 4.2 Close SDK

try {
    ta.close();
} catch (Exception e) {
    //exception handling
    System.out.println("except:" + e);
}

Close and exit the SDK. Please call this interface before shutting down your server to avoid losing the data in the cache.
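
Since close() should run before the process exits, one option is to register it in a JVM shutdown hook. This is only a sketch, not part of the SDK itself:

//A sketch: close the SDK when the JVM shuts down so buffered data is flushed
//("ta" is the instance created during initialization and must be final or effectively final)
Runtime.getRuntime().addShutdownHook(new Thread() {
    @Override
    public void run() {
        try {
            ta.close();
        } catch (Exception e) {
            //exception handling
            System.out.println("except:" + e);
        }
    }
});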

# V. Relevant Preset Attributes

# 5.1 Preset Properties for All Events

The following preset properties are attached to all events uploaded by the Java SDK, including auto-tracked events.

| Attribute Name | Display Name | Description |
| --- | --- | --- |
| #ip | IP address | The user's IP address; must be set manually. TA uses it to derive the user's geographic location |
| #country | Country | The user's country, derived from the IP address |
| #country_code | Country code | The country code of the user's country (ISO 3166-1 alpha-2, i.e. two uppercase letters), derived from the IP address |
| #province | Province | The user's province, derived from the IP address |
| #city | City | The user's city, derived from the IP address |
| #lib | SDK type | The type of SDK used, e.g. Java |
| #lib_version | SDK version | The version of the Java SDK used |

# VI. Advanced Functions

Starting with v1.6.0, the SDK supports reporting two special types of events: updatable events and rewritable events. Both require TA system version 2.8 or later. Since special events are only applicable in specific scenarios, please report data with special events under the guidance of ThinkingData's customer success team and analysts.

# 6.1 Updatable Events

Updatable events let you modify event data in specific scenarios. An updatable event requires an ID that identifies the event, passed in when creating the updatable event object. The TA background determines which data to update based on the event name and event ID.

// Example: report an updatable event, assuming the event name is UPDATABLE_EVENT
Map<String, Object> properties = new HashMap<>();
properties.put("price", 100);
properties.put("status", 3);
// After reporting, the event properties are status = 3 and price = 100
ta.track_update(account_id, distinct_id, "UPDATABLE_EVENT", "test_event_id", properties);

// Report again with the same event name and event ID (test_event_id); status is updated to 5 and price is unchanged
Map<String, Object> propertiesNew = new HashMap<>();
propertiesNew.put("status", 5);
ta.track_update(account_id, distinct_id, "UPDATABLE_EVENT", "test_event_id", propertiesNew);

# 6.2 Rewritable Events

Rewritable events are similar to updatable events, except that a rewritable event completely replaces its historical data with the latest data, which in effect is equivalent to deleting the previous data and storing the latest. The TA background determines which data to update based on the event name and event ID.

// Example: report a rewritable event, assuming the event name is OVERWRITE_EVENT
Map<String, Object> properties = new HashMap<>();
properties.put("price", 100);
properties.put("status", 3);
// After reporting, the event properties are status = 3 and price = 100
ta.track_overwrite(account_id, distinct_id, "OVERWRITE_EVENT", "test_event_id", properties);

Map<String, Object> propertiesNew = new HashMap<>();
propertiesNew.put("status", 5);

// After reporting, status is updated to 5 and the price property is deleted
ta.track_overwrite(account_id, distinct_id, "OVERWRITE_EVENT", "test_event_id", propertiesNew);

# 6.3 First Event Check Function

To use the "first event check" feature, set the #first_check_id field (type String) in properties; it is the ID used for the first-occurrence check. Only the first record with a given ID is stored; later records with the same ID cannot be stored. The #first_check_id values of different events are independent, so the first-event check of each event does not interfere with the others.

// Example: report an event with first-event check, assuming the event name is EVENT
Map<String, Object> properties = new HashMap<>();
properties.put("price", 100);
properties.put("status", 3);
properties.put("#first_check_id", "123456");
// Report the event
ta.track(account_id, distinct_id, "EVENT", properties);

# 6.4 Timing Refresh Function

By default, the Java SDK only flushes automatically based on the amount of buffered data: flush() is called once the buffer exceeds 8 KB. Version v1.7.0 adds a timed-flush function: for BatchConsumer and LoggerConsumer, you can enable it with the autoFlush and interval parameters of their Config classes.

ThinkingDataAnalytics.BatchConsumer.Config config = new ThinkingDataAnalytics.BatchConsumer.Config();
config.setAutoFlush(true);
//interval: flush period in seconds; the default is 3
config.setInterval(5);
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(new ThinkingDataAnalytics.BatchConsumer("http://localhost:8991", "APP_ID", config));
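
The same switches apply to LoggerConsumer. A minimal sketch, assuming LoggerConsumer.Config exposes the same autoFlush/interval settings described above:

//A sketch: timed flush for LoggerConsumer (LOG_DIRECTORY is your local log folder)
ThinkingDataAnalytics.LoggerConsumer.Config logConfig = new ThinkingDataAnalytics.LoggerConsumer.Config(LOG_DIRECTORY);
logConfig.setAutoFlush(true);
//interval: flush period in seconds
logConfig.setInterval(5);
ThinkingDataAnalytics ta = new ThinkingDataAnalytics(new ThinkingDataAnalytics.LoggerConsumer(logConfig));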

# ChangeLog

# v1.9.1 2021/12/20

  • Fixed a problem where connection-pool connections were closed abnormally

# v1.9.0 2021/12/03

  • Added support for uploading object and object list data

# v1.8.2 2021/08/31

  • Fixed a bug where connections were closed abnormally

# v1.8.1 2021/06/17

  • Updated the HTTP client version
  • Added exception logging

# v1.8.0 2021/03/22

  • Added a cache pool to store data that failed to send due to network issues and retry it on later sends
  • Added LoggerConsumer support for the #app_id attribute

# v1.7.0 2020/12/08

  • Added the timed-flush function
  • Added automatic directory creation to LoggerConsumer

# v1.6.0 2020/08/24

  • Added track_update interface to support updatable events
  • Added track_overwrite interface to support rewritable events
  • Supported configuring the LoggerConsumer file name prefix

# v1.5.3 2020/07/08

  • Fixed the missing fastjson dependency problem in v1.5.2

# v1.5.2 2020/07/07 (deprecated)

  • Upgraded fastjson from 1.2.44 to 1.2.71, fixing a deserialization vulnerability in the earlier version
  • Fixed a possible memory leak in BatchConsumer

# v1.5.1 2020/04/21

  • Remove #time type interception

# v1.5.0 2020/02/10

  • Supported the List data type
  • Added user_append interface to support attribute appending of user's array type

# v1.4.0 2020/01/06

  • Added the user_unset interface for clearing user properties
  • BatchConsumer performance optimization: support for configuring compression mode; remove Base64 encoding
  • DebugConsumer Optimization: More complete and accurate validation of data at the server level

# v1.3.1 2019/09/27

  • Removed the default 1 GB file-size cap of LoggerConsumer; users can configure splitting by day, hour, or size

# v1.3.0 2019/09/26

  • Removed ProduceKafka to avoid excessive dependencies

# v1.2.0 2019/09/12

  • Added DebugConsumer for easier debugging during integration
  • Optimized LoggerConsumer to support log file segmentation by hour
  • Optimize code and improve stability

# v1.1.17 2019/08/23

  • Optimized the exception message printed when reporting abnormal data
  • Added alerts for abnormal BatchConsumer request return codes

# v1.1.16 2019/05/30

  • Fixed a bug where LoggerConsumer did not flush data under multithreading
  • Fixed data duplication in BatchConsumer under multithreading

# v1.1.15 2019/04/28

  • Fixed a Java 1.7 compatibility bug
  • Fixed LoggerConsumer not writing to disk at intervals

# v1.1.14 2019/04/25

  • Compatible with Java 1.7
  • Optimized the reporting mechanism of LoggerConsumer

# v1.1.13 2019/04/11

  • Optimize the performance and stability of data reporting
  • Adjusted the default parameters of the Consumer

# v1.1.9 2018/09/03

  • Added an asynchronous transfer function to BatchConsumer; see the API doc for details