In this blog, we will look at the various security levels available for querying tables using custom columns or custom filter values in the Helical Insight adhoc reports module.

Helical Insight has various placeholders where you can use SQL queries. At the metadata level, there is an option of writing a SQL query and saving it as a view. At the adhoc report level, you can use queries in custom columns, filter expressions, the having clause, and custom filter conditions and values. At the dashboard level, using the console, you can access the filter values and modify them.


From version 4.0 GA onwards, changes have been implemented to overcome security and SQL injection issues that are possible at the adhoc report and dashboard levels. It is highly recommended that end clients migrate to this latest version in order to avoid security issues. This blog explains the various security rules and their usage. There are 4 types of security rules that can be defined in Helical Insight for the use of columns/tables while defining custom columns/custom filter values:

  1. default: This level of security is effectively no security. It is the behaviour available in versions older than 4.0 GA. This level has been exposed so that, during migration from older versions, you can temporarily use the “default” rule so that your older reports (where you might have used custom columns, filter expressions etc.) continue to work, and then start changing those reports so that they are compatible with “strict-metadata” or the even higher security settings we provide.
  2. strict-metadata: In Helical Insight, the metadata consists of the tables and columns that are used to create reports. The “strict-metadata” level of security implies that database functions and SQL queries can be used in custom columns, filter values, filter expressions and the having clause only with tables and columns that are part of the metadata. One difference from older versions is that any column referenced in any of these places (custom column, filter expression, custom filter etc.) has to be fully qualified, i.e. table name and column name (catalog name and schema name as well if there are multiple catalogs and schemas). If tables/columns other than those in the metadata are referenced, an error is thrown.
  3. strict: This level of security restricts the use of custom SQL placeholders altogether. If any custom column, custom filter value/condition, custom filter expression or having clause is used, an error is thrown.
  4. stripped: This level of security ignores any custom placeholders. If any custom column, custom filter value/condition, custom filter expression/having clause is used, it is completely ignored and the query is executed without it. If a custom column is used, blanks are displayed. If a custom filter condition or value is used, the filter is ignored completely.

  The table below shows the difference between the permissions of all these types:

By default, the application comes with the setting “strict-metadata”. The security settings can be changed in the file defaults.properties located at …/hi/hi-repository/System/Admin/

There are separate placeholders where SQL injection is possible – custom columns, filter values/conditions, filter expressions, the having clause and the dashboard level.

The defaults.properties file has various properties for each of these placeholders that can be toggled and the security levels defined according to requirement.
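For illustration, a simplified defaults.properties fragment might look like the sketch below. The property keys shown here are assumptions for readability, not the exact names used in the file; refer to the comments in your own defaults.properties (around lines 68–94) for the real keys:

```properties
# Rule for custom columns (around line 68) and custom filter values (around line 70).
# Possible values: default | strict-metadata | strict | stripped
adhoc.customColumn.rule = strict-metadata
adhoc.filterValue.rule = strict-metadata

# Filter expressions (line 75) and having clause (line 77) use sql-detect by default.
adhoc.filterExpression.rule = sql-detect
adhoc.havingClause.rule = sql-detect

# Patterns checked by the sql-detect rule (line 94).
sqlpattern = ALTER,CREATE,DELETE,DROP,EXECUTE,INSERT,MERGE,SELECT,UPDATE,UNION,FROM,WHERE,SLEEP
```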

Lines 68 and 70 define which level of security is set for the custom column and filter values.



By default, when Helical Insight 4.0 GA onwards is installed, the rule is strict-metadata. If you are migrating from an older version, you can change it to default on lines 68 and 70 so that your custom columns and other customizations continue to work as is. You can then start migrating them and making them compatible with strict-metadata to address the security issues. In strict-metadata, in order to use SQL queries in custom columns or filter values, you need to use fully qualified column names. This means that you need to provide the column name in the format –

<catalog/schema name>.<table name>.<column name>
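As a concrete sketch of this format, using a hypothetical schema testSchema containing table T1 with column C1 (the same names the test cases later in this post use), a custom column and a full query under strict-metadata would be written as:

```sql
-- Fully qualified column reference; schema/table/column names are illustrative.
"testSchema"."T1"."C1"

-- A full query must also qualify every column it touches:
SELECT "testSchema"."T1"."C1"
FROM "testSchema"."T1"
WHERE "testSchema"."T1"."C1" = 1
```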

If you want to restrict use of custom conditions or custom values in filter from the adhoc report interface, uncomment lines 72 and 73.



If you uncomment lines 72 and 73, the custom option will still be visible on the frontend. However, if you try to use a custom filter condition or custom filter value, an error will be thrown.

When the settings are as shown below, we get an error if we use “Custom” condition/filter value.

If we use the below configuration, then custom conditions and custom filter values will work as long as the columns and tables being referred are part of the metadata in use.



SQL queries can also be used in filter expressions and the having clause. However, by default, these are restricted by the “sql-detect” rule, as shown on lines 75 and 77.



The sql-detect rule uses the sqlpattern property defined on line 94 to detect SQL and throws an error if any of the patterns defined there is detected. By default, the ALTER, CREATE, DELETE, DROP, EXECUTE, INSERT, MERGE, SELECT, UPDATE, UNION, FROM, WHERE and SLEEP keywords are restricted. The sql-detect rule is applicable only to filter expressions and the having clause.
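The keyword check can be pictured with a short Python sketch. This is an illustrative approximation of the rule, not Helical Insight's actual implementation; the matching semantics (whole-word, case-insensitive) are assumptions:

```python
import re

# Keywords restricted by default, per the list above.
SQL_KEYWORDS = [
    "ALTER", "CREATE", "DELETE", "DROP", "EXECUTE", "INSERT",
    "MERGE", "SELECT", "UPDATE", "UNION", "FROM", "WHERE", "SLEEP",
]

# Case-insensitive pattern that matches any restricted keyword as a whole word.
SQL_PATTERN = re.compile(r"\b(" + "|".join(SQL_KEYWORDS) + r")\b", re.IGNORECASE)

def sql_detect(expression: str) -> bool:
    """Return True if the expression contains a restricted SQL keyword."""
    return SQL_PATTERN.search(expression) is not None

# A plain comparison passes, while an embedded SELECT is flagged.
print(sql_detect('{0}.column = 10'))                           # False
print(sql_detect('{0}.column = select "T1"."C1" from "T1"'))   # True
```

In the real application, an expression that trips this check results in an error being thrown rather than the query being executed.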

You can add more characters or keywords which will be checked by the sql-detect rule.



  If you are using JSON, you might want to use complex JSON queries in custom columns for nested JSON, like 'schemaName'.'tableName'.'columnName'.'propertyName'[0]. In that case, change line 83 to

strict-metadata-object = allow_complex_json_queries



By default, strict-metadata-object is set to default, which means that complex JSON queries cannot be used. If you are already using complex JSON queries in an earlier version, set this value to “allow_complex_json_queries”. This has to be done for any database where the json datatype is being used.
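As an illustration, the difference between a plain column reference and a complex JSON reference looks like this (all names are hypothetical):

```sql
-- Plain column reference (works under the default strict-metadata-object):
'schemaName'.'tableName'.'columnName'

-- Complex JSON reference into the first array element of a nested property;
-- requires strict-metadata-object = allow_complex_json_queries:
'schemaName'.'tableName'.'columnName'.'propertyName'[0]
```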

EFW files are created using the canned reports module, the dashboard designer or EFW CE reports. Using the console, the dashboard variables and other information can be accessed and the filter values modified, so there is a possibility of SQL injection here as well.
To restrict this, line 88 is uncommented by default. Here, sql-detect is used; hence, if any pattern defined in sqlpattern on line 94 is detected, an exception is thrown.

If you comment line 88 and uncomment line 90, then EFW file SQL injection will be possible as in versions prior to 4.0 GA.

Examples for Various Rules

Let us assume that there are 2 tables in the metadata.

Table T1 has columns C1, C2, C3 and Table T2 has columns C4, C5, C6

There is another table in the database schema which is excluded from the metadata.

Table T3 has columns C7, C8

Rule 1: default

Change the rule for custom columns and filter values to default. You can comment out lines 75 and 77, so no SQL detection will take place for filter expressions and the having clause.

Test Case for Custom Column.

Case 1: "T1"."C1"
Here, T1 and C1 are part of the metadata and the custom column is executed without any issue.

Case 2: Select "T1"."C1" from "T1" where "C1" = 1
Here, T1 and C1 are part of the metadata. The query is executed without any error.

Case 3: "T3"."C7"
Here, T3 is not part of the metadata and an error is thrown as the table and column are unidentified.

Case 4: Select "T3"."C7" from "T3" where "C7" = 1
Here, T3 is not part of the metadata. However, it is part of the schema that is being used to create the metadata. With the select clause, we can execute the query even though the table is not part of the metadata.

Test Case for Filter Values


Case 1: "T1"."C1"
Here, T1 and C1 are part of the metadata and the custom filter value is applied without any issue.

Case 2: Select "T1"."C1" from "T1" where "C1" = 1
Here, T1 and C1 are part of the metadata. The query is executed and the filter value applied without any error.

Case 3: "T3"."C7"
Here, T3 is not part of the metadata and an error is thrown as the table and column are unidentified.

Case 4: Select "T3"."C7" from "T3" where "C7" = 1
Here, T3 is not part of the metadata. However, it is part of the schema that is being used to create the metadata. With the select clause, we can execute the query even though the table is not part of the metadata.

Test Case for Filter Expression/Having Clause


{0}.column = "T1"."C1"
Here, T1 and C1 are part of the metadata and the custom filter expression is executed without any issue.

{0}.column = Select "T1"."C1" from "T1" where "C1" = 1
Here, T1 and C1 are part of the metadata. The query is executed and the filter expression is applied without any error.

{0}.column = "T3"."C7"
Here, T3 is not part of the metadata and an error is thrown as the table and column are unidentified.

{0}.column = Select "T3"."C7" from "T3" where "C7" = 1
Here, T3 is not part of the metadata. However, it is part of the schema that is being used to create the metadata. With the select clause, we can execute the query even though the table is not part of the metadata.

Note: If the filter expression and having clause have sql-detect enabled, then select statements will not work whether the column/table is within the metadata or outside it.

Rule 2: strict-metadata
This is the security rule applied by default in Helical Insight 4.0 GA onwards.
You can comment out lines 75 and 77, so no SQL detection will take place for filter expressions and the having clause.

Test Case for Custom Column.



Case 1: "T1"."C1"
Here, T1 and C1 are part of the metadata and the custom column is executed without any issue if there is only one schema/catalog in the database.
If there are multiple catalogs/schemas in the database, you need to use the fully qualified name of the column for the query to be executed successfully.

Fully qualified column name format will be:

<catalogName>.<schemaName>.<tableName>.<columnName>

Case 2: Select "testSchema"."T1"."C1" from "testSchema"."T1" where "testSchema"."T1"."C1" = 1
Here, T1 and C1 are part of the metadata. The query is executed without any error if there is only a single schema/catalog in the database.

If there are multiple catalogs/schemas in the database, you need to use the fully qualified name of the column for the query to be executed successfully.

Fully qualified column name format will be:

<catalogName>.<schemaName>.<tableName>.<columnName>

Case 3: "T3"."C7"
Here, T3 is not part of the metadata and an error is thrown as the table and column are unidentified.

Case 4: Select "testSchema"."T3"."C7" from "testSchema"."T3" where "testSchema"."T3"."C7" = 1
Here, T3 is not part of the metadata. Hence, even if we use the fully qualified query, we will not be able to access any columns from T3 and an error will be thrown.

Test Case for Filter Values


Case 1: "T1"."C1"
Here, T1 and C1 are part of the metadata and the filter value is applied without any issue if there is only one schema/catalog in the database.

If there are multiple catalogs/schemas in the database, you need to use the fully qualified name of the column for the query to be executed successfully.

Fully qualified column name format will be:

<catalogName>.<schemaName>.<tableName>.<columnName>

Case 2: Select "testSchema"."T1"."C1" from "testSchema"."T1" where "testSchema"."T1"."C1" = 1
Here, T1 and C1 are part of the metadata. The query is executed without any error if there is only a single schema/catalog in the database.

If there are multiple catalogs/schemas in the database, you need to use the fully qualified name of the column for the query to be executed successfully.

Fully qualified column name format will be:

<catalogName>.<schemaName>.<tableName>.<columnName>

Case 3: "T3"."C7"
Here, T3 is not part of the metadata and an error is thrown as the table and column are unidentified.

Case 4: Select "testSchema"."T3"."C7" from "testSchema"."T3" where "testSchema"."T3"."C7" = 1
Here, T3 is not part of the metadata. Hence, even if we use the fully qualified query, we will not be able to access any columns from T3 and an error will be thrown.

Test Case for Filter Expression/Having Clause


{0}.column = "testSchema"."T1"."C1"
Here, T1 and C1 are part of the metadata and the custom filter expression is executed without any issue.

{0}.column = Select "testSchema"."T1"."C1" from "testSchema"."T1" where "testSchema"."T1"."C1" = 1
Here, T1 and C1 are part of the metadata. The query is executed and the filter expression is applied without any error.

{0}.column = "T3"."C7"
Here, T3 is not part of the metadata and an error is thrown as the table and column are unidentified.

{0}.column = Select "T3"."C7" from "T3" where "C7" = 1
Here, T3 is not part of the metadata. Hence, even if a fully qualified query is used, an error is thrown.

Note: If the filter expression and having clause have sql-detect enabled, then select statements will not work whether the column/table is within the metadata or outside it.

 
Rule 3: strict
Change the security rule to strict.

You can comment out lines 75 and 77, so no SQL detection will take place for filter expressions and the having clause.

When we use any “custom” placeholder or write “custom” code anywhere and the rule in the defaults.properties file is strict, we get an error as soon as “custom” is detected in the JSON object of the report request.



In all the tests where we try to use a custom column, custom filter value or custom filter expression/having clause, an error is thrown.

There is a complete restriction on the usage of custom placeholders.

Rule 4: stripped
Change the rule for custom columns and filter values to stripped.

You can comment out lines 75 and 77 to remove SQL detection for filter expressions and the having clause.



If the rule is stripped, then as soon as “custom” is detected in the JSON object, that part is completely ignored and the query is executed without the parts of the query mentioned in the “custom” sections.

If we use a Custom Column, the query will be executed, but the custom column will remain blank and no data will be displayed in that column.

If we use a custom filter value or filter condition, the filter condition is totally ignored and the entire result set is displayed (as if the filter was not applied).

This particular rule is not applicable for the filter expressions and having clause.
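The stripped behaviour can be pictured with a short Python sketch. The request structure and field names here are invented for illustration and do not reflect Helical Insight's internal report format:

```python
def strip_custom(request: dict) -> dict:
    """Drop any 'custom' parts of a report request before the query is built."""
    cleaned = dict(request)
    # Drop custom columns entirely (in the report they render as blanks).
    cleaned["columns"] = [
        c for c in request.get("columns", []) if c.get("type") != "custom"
    ]
    # Drop filters that use a custom condition or custom value.
    cleaned["filters"] = [
        f for f in request.get("filters", [])
        if f.get("condition") != "custom" and f.get("value_type") != "custom"
    ]
    return cleaned

request = {
    "columns": [{"name": "T1.C1"}, {"name": "expr", "type": "custom"}],
    "filters": [{"name": "T1.C2", "condition": "custom"}],
}
# The custom column and the custom filter are silently removed;
# the remaining query runs as if they were never specified.
print(strip_custom(request))
```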

The table below summarizes the test cases for the various rules:



Rule 5: sql-detect
This rule is applicable only to the filter expression and having clause sections.

This rule is especially useful to avoid any kind of SQL injection via filter expressions. As soon as sql-detect is enabled for filter expressions and the having clause, the sqlpattern is checked against the filter expression. If any of the patterns mentioned in sqlpattern is detected, an error is thrown.

Thanks,
Helical Insight

Helical Insight’s self-service capabilities is one to reckon with. It allows you to simply drag and drop columns, add filters, apply aggregate functions if required, and create reports and dashboards on the fly. For advanced users, the self-service component has ability to add javascript, HTML, HTML5, CSS, CSS3 and AJAX. These customizations allow you to create dynamic reports and dashboards. You can also add new charts inside the self-service component, add new kind of aggregate functions and customize it using our APIs.
Helical Insight, via simple browser based interface of Canned Reporting module, also allows to create pixel perfect printer friendly document kind of reports also like Invoice, P&L Statement, Balance sheet etc.
If you have a product, built on any platform like Dot Net or Java or PHP or Ruby, you can easily embed Helical Insight within it using iFrames or webservices, for quick value add through instant visualization of data.
Being a 100% browser-based BI tool, you can connect with your database and analyse across any location and device. There is no need to download or install heavy memory-consuming developer tools – All you need is a Browser application! We are battle-tested on most of the commonly used browsers.
We have organization level security where the Superadmin can create, delete and modify roles. Dashboards and reports can be added to that organization. This ensures multitenancy.
A first-of-its-kind Open-Source BI framework, Helical Insight is completely API-driven. This allows you to add functionalities, including but not limited to adding a new exporting type, new datasource type, core functionality expansion, new charting in adhoc etc., at any place whenever you wish, using your own in-house developers.
It handles huge volumes of data effectively. Caching, Pagination, Load-Balancing and In-Memory not only provides you with amazing experience, but also and does not burden the database server more than required. Further effective use of computing power gives best performance and complex calculations even on the big data even with smaller machines for your personal use. Filtering, Sorting, Cube Analysis, Inter Panel Communication on the dashboards all at lightning speed. Thereby, making best open-source Business Intelligence solution in the market.
With advance NLP algorithm, business users simply ask questions like, “show me sales of last quarter”, “average monthly sales of my products”. Let the application give the power to users without knowledge of query language or underlying data architecture
Our application is compatible with almost all databases, be it RDBMS, or columnar database, or even flat files like spreadsheets or csv files. You can even connect to your own custom database via JDBC connection. Further, our database connection can be switched dynamically based on logged in users or its organization or other parameters. So, all your clients can use the same reports and dashboards without worrying about any data security breech.
Our application can be installed on an in-house server where you have full control of your data and its security. Or on cloud where it is accessible to larger audience without overheads and maintenance of the servers. One solution that works for all.
Different companies have different business processes that the existing BI tools do not encompass. Helical Insight permits you to design your own workflows and specify what functional module of BI gets triggered