TQL - Total Query Language

TQL uses a schema-based system to query records and select columns, relying on a series of suffixes and prefixes to modify query parameters and establish automatic aggregations.

TQL Arguments

Query arguments should be passed in using the column names. This is the "where" part of the API; you can combine any number of query criteria.

By default, each argument you pass narrows the criteria further, since each filter creates an "AND" condition.

Example: Find Policies Created Today, AND Sold Today

Query String: ?date_created=Today&date_sold=Today
JSON: { "date_created": "Today", "date_sold": "Today" }

You can create an "OR" query by passing an array called 'or' containing one or more groups of criteria. Criteria within a group are combined with AND, and the groups themselves are combined with OR.

Example: Find Policies Created Today OR Sold Today

Query String: ?or[0][date_created]=Today&or[1][date_sold]=Today
JSON: { "or": [ { "date_created": "Today" }, { "date_sold": "Today" } ] }

Example: Find Policies with ( Date Created Today AND Converted Today ) OR ( Created Today AND Sold Today )

Query String: ?or[0][date_created]=Today&or[0][date_converted]=Today&or[1][date_sold]=Today&or[1][date_created]=Today
JSON: { "or": [ { "date_created": "Today", "date_converted": "Today" }, { "date_sold": "Today", "date_created": "Today" } ] }

TQL Columns

Columns should be passed in using the special columns argument, which can be a comma-separated list of the columns you want to retrieve, or an array of columns.

Order Matters! The order in which you pass in the columns determines their placement in the retrieved rows.

Example: Return Lead ID, First Name, Last Name, Phone and Status ID
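
A request along these lines would return those five columns, in that order. The exact column names are assumptions based on the naming used elsewhere in these examples:

Query String: ?columns=lead_id,first_name,last_name,phone,status_id
JSON: { "columns": ["lead_id", "first_name", "last_name", "phone", "status_id"] }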

TQL Suffixes

TQL suffixes vary by datatype, since certain suffixes make logical sense for some datatypes (string, text, number, decimal, date, datetime, time, boolean) and not for others. The supported suffixes are listed below.

Please note these can NOT be used as column selectors; they are for query criteria only.

Supported Suffixes


no suffix - Column equals Value
_has - 0: Column Value does not Exist or is Null or Empty. 1: Column Value Exists and is not Empty.
_not - Column is NOT Equal To Value
_in - Column is one of Multiple Values. Forces String Searches to Parse Comma as Multiple Values.
_in_not - Column Value is NOT one of Multiple Values
_like - Column Contains Text Value
_like_not - Column Value does NOT Contain Text
_like_right - Column Begins With Value
_like_right_not - Column Does NOT Begin With Value
_like_left - Column Ends With Value
_like_left_not - Column Does NOT End With Value
_greater - Column Value is Greater Than, but NOT Equal to
_less - Column Value is Less Than, but NOT Equal to
_less_equal - Column Value is Less Than or Equal to
_greater_equal - Column Value is Greater Than or Equal to
_end - Column Value is Between Primary Value and End Value
_end_not - Column Value is NOT Between Primary Value and End Value
_between - Column Value is Between an Array of Two Values (Shortcut for _end, array only)
_between_not - Column Value is NOT Between an Array of Two Values (Shortcut for _end_not, array only)

Example: Complex Query to find Sales Made Prior to this Month, with a Premium of 100.00 to 200.00 with a Product Name that Starts with “Aetna”
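
One way to express that query, assuming the column names date_sold, premium, and product_name (the assumptions are explained below):

Query String: ?date_sold_less=This%20Month&premium=100&premium_end=200&product_name_like_right=Aetna
JSON: { "date_sold_less": "This Month", "premium": 100, "premium_end": 200, "product_name_like_right": "Aetna" }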

A few assumptions are made here. When using "This Month", which is a relative date, the _less suffix will look at the beginning of the month. When premium_end is present along with premium, the query switches from premium = 100 to BETWEEN 100 AND 200. product_name_like_right will actually look into the products table, via product_id on the policies table, and perform a LIKE 'Aetna%' query. TQL is very powerful once you know how the suffixes interact with each other and what they mean; they map closely to native SQL.

TQL Prefixes

TQL prefixes are aggregations: they wrap the normal value of a field with an aggregation function. They only show up when using the aggregates: true TQL parameter. TQL does have native built-in aggregations for some tables, such as "Total" and "Average", but not all. Prefixes allow for the most common aggregations without having to manually define them in the schema.

These aggregations can be used as both column selectors and query arguments! They ALSO support TQL suffixes along with their prefixes; for example, tql_sum_premium_greater: 200 with group_by: lead_id would sum the premium of all policies by lead_id and return results where the total is greater than 200.
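
As a sketch of that example, with aggregates enabled so the tql_ flags are available (the column names are assumed from the description above):

Query String: ?aggregates=1&group_by=lead_id&tql_sum_premium_greater=200&columns=lead_id,tql_sum_premium
JSON: { "aggregates": true, "group_by": "lead_id", "tql_sum_premium_greater": 200, "columns": ["lead_id", "tql_sum_premium"] }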

Supported Prefixes

Prefix support varies by datatype group (string/all types; boolean, number, decimal, currency; date, datetime, timestamp).


tql_cnt_ (Count of) - Count of Rows
tql_cnu_ (Unique Count of) - Distinct Count of Rows
tql_cmp_ (Compiled) - Comma Concatenated Values
tql_cmu_ (Unique Compiled) - Distinct Comma Concatenated Values
tql_cml_ (Last Compiled) - Last Concatenated Value
tql_cmf_ (First Compiled) - First Concatenated Value
tql_min_ (Minimum of) - Minimum Value
tql_max_ (Maximum of) - Maximum Value
tql_sum_ (Sum of) - Sum of Values
tql_avg_ (Average of) - Average of Values
tql_spy_ (Year Span of) - Difference in Years between Min / Max
tql_spq_ (Quarter Span of) - Difference in Quarters between Min / Max
tql_spm_ (Month Span of) - Difference in Months between Min / Max
tql_spw_ (Week Span of) - Difference in Weeks between Min / Max
tql_spd_ (Day Span of) - Difference in Days between Min / Max
tql_spi_ (Minute Span of) - Difference in Minutes between Min / Max
tql_sps_ (Seconds Span of) - Difference in Seconds between Min / Max

Example: I want to Sum the Premiums for all Sales this Month and get the Total Leads and Total Policies Sold.
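
A request along these lines could express that. policy_id is an assumed column name, the relative value "This Month" is assumed to work on date_sold the same way it does in the earlier example, and aggregate: true is used here on the assumption that removing the default grouping collapses the totals into a single row:

JSON: { "aggregates": true, "aggregate": true, "date_sold": "This Month", "columns": "tql_sum_premium,tql_cnu_lead_id,tql_cnt_policy_id" }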

The results come back with the summed premium alongside the total lead and policy counts we wanted.

Column Renaming

Column renaming doesn't work that well with query string parameters, as you cannot pass this data in via a comma-separated list; however, with a query array or a JSON array of objects, as in the following example, it is possible to change the name of a column when it is output. Be warned: if you name the column the same as an existing column, your renamed column will take priority if it is listed after any standard columns of the same name.
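
The exact renaming syntax is not shown here, so this sketch assumes each rename is expressed as an object mapping the schema column name to the desired output key:

JSON: { "columns": [ "lead_id", { "first_name": "FirstName" }, { "last_name": "LastName" } ] }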

As you can see, the results changed their output keys! This is useful for feeding information into hard-to-work-with systems where parameter renaming isn't possible.

Column Resolving

Another lesser-known trick is column resolution. As mentioned earlier, importing a table will automatically prefix its columns with the provided prefix; the resolve parameter can attempt to automatically undo that. You can pass an array or a comma-separated list of prefixes to resolve, which will remove the prefix from the query parameters as well as the column names. The most common case is getting vendor_id instead of lead_vendor_id from the policies table. You can also opt to use column renaming instead for specific columns. Column resolving, unlike renaming, will not allow conflicts with the base table, since it is a sweeping schema-wide modification and must preserve data integrity.
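
A minimal sketch of the vendor_id case described above, assuming the prefix being resolved is named lead:

Query String: ?resolve=lead&columns=vendor_id,first_name,last_name
JSON: { "resolve": "lead", "columns": ["vendor_id", "first_name", "last_name"] }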

 

As you can see, we now have Vendor ID, First Name, and Last Name, all as if we were looking at the Leads table.

Special Parameters



import
Values: string|array|none
Description: A list of "tags" related to relationships on base columns. Importing brings in the related table and auto-prefixes its columns, which makes them available for both query criteria and column selection.
Notes: Default imports are based on the endpoint. Be careful with imports, as generic names can further import other tables joined to the imported tables, which you may not want or need. If you don't want imports to populate automatically, set import: none.
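
For illustration only, assuming the policies endpoint exposes an import tag named lead for its lead relationship (actual tag names depend on the schema):

Query String: ?import=lead&columns=policy_id,lead_first_name
JSON: { "import": "lead", "columns": ["policy_id", "lead_first_name"] }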

limit
Values: integer|'0'
Description: The number of rows to return. This is used in conjunction with page.
Notes: Default 10, max 100,000. Use limit: 0 to return ALL results. Make sure to use format: stream or format: csv when using limit 0 to prevent out-of-memory errors, and make sure to pass '0' as a string, not a number, in this particular case.
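
For example, a full export might look like this; note that the 0 is passed as a string:

Query String: ?limit=0&format=stream
JSON: { "limit": "0", "format": "stream" }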

page
Values: integer
Description: The page, based on the limit, from which to show results.
Notes: Default 1.

columns
Values: string|array
Description: A comma-separated list of column names or an array of columns to return. Columns will return in the order provided.
Notes: Default columns are based on the endpoint.

single
Values: true|false
Description: This will attempt to return just one item as an object instead of an array of objects.
Notes: Default false.

collection
Values: true|false
Description: This will always return results as an array of objects, including when you are querying for a primary key.

order
Values: array
Description: An array of keys matching schema column names, with values of DESC or ASC, for multi-column sorting. If present, this overrides order_by and sort. Example: order: { premium: DESC, date_sold: DESC }
Notes: Default order is based on the endpoint.
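
In query string form this would presumably follow the same bracket convention as the or parameter:

Query String: ?order[premium]=DESC&order[date_sold]=DESC
JSON: { "order": { "premium": "DESC", "date_sold": "DESC" } }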

order_by
Values: string
Description: Alternative to order, but supports only one ordering column.
Notes: Default order by is based on the endpoint. For large or complex queries it is recommended to set order_by: none to improve query performance.

sort
Values: ASC|DESC
Description: Alternative to order, but supports only one ordering column.
Notes: Default sort is based on the endpoint.

group_by
Values: string|array
Description: Grouping is somewhat like making a pivot table. You group_by one or more columns, and the results are "squashed" down when one or more aggregate columns are set. A common use is finding total leads per vendor: group_by vendor_id, then select the vendor_id column plus an aggregate, such as columns: vendor_id,tql_cnt_lead_id.
Notes: Default group by is based on the schema and is often set to the primary key. If you are having query trouble and you know your query has no aggregates, it is recommended to pass group_by: false to improve query performance for large or complex queries.
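
The leads-per-vendor example might look like this, with aggregates enabled so the tql_cnt_ prefix is available:

Query String: ?aggregates=1&group_by=vendor_id&columns=vendor_id,tql_cnt_lead_id
JSON: { "aggregates": true, "group_by": "vendor_id", "columns": ["vendor_id", "tql_cnt_lead_id"] }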

aggregate
Values: true|false
Description: True or false (1 / 0). Whether or not to remove the default grouping from the query.
Notes: Default false.

aggregates
Values: true|false
Description: True or false (1 / 0). Whether or not to include tql_-prefixed aggregate query flags. Currently this is not enabled by default so that schema sizes stay small based on imports. This may change in the future and no longer be needed.

resolve
Values: string|array|true
Description: Resolve a prefix, a set of prefixes, or all prefixes where possible and remove them. This changes the query argument names and the column names of those parameters.
Notes: Default false. Setting true will attempt to resolve all join tables.

format
Values: csv|json|stream
Description: Not specifying a format will return a JSON object containing the results array along with debugging information, which is problematic with large datasets. Using csv, json, or stream will stream data to the browser, allowing you to access much larger datasets.
Notes: Be careful with pulling more than 1 million records at a time, as Excel itself will only be able to open about 1 million rows.