This section describes techniques to normalize numeric values in your datasets. Ideally, your source systems are configured to capture and deliver data using a consistent set of units in a standardized structure and format. In practice, data from multiple systems can reveal differences in the level of precision used in numeric data or differences in text entries that reference the same thing. Within Trifacta® Wrangler Enterprise, you can use the following techniques to address some of the issues you might encounter when standardizing units and values for numeric types.

## Standardize decimal precision

If decimal values in a column are of varying levels of precision, you can standardize to a single level of precision.

**Steps:**

- From the column menu, select **Column Details**.
- In the Column Details panel, select the Patterns tab.
- Among the patterns, select the following:

  ```
  {digit}.{digit}
  ```

- In the Suggestions panel on the right, locate the `set` transform suggestion that uses the `ROUND` function. Click **Edit**.
- Change the second parameter of the `ROUND` function to match the number of digits of precision.

You can generalize this formatting across multiple columns by applying the `$col` reference in the transform's function, as in the following:

```
set col:colA, colB, colC value: IFVALID($col, ['Float'], ROUND($col, 2))
```

See Column Details Panel. For more information, see ROUND Function.
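Outside of the Trifacta transform language, the same standardization can be sketched in Python. This is an illustrative sketch only; the function name and sample values are hypothetical:

```python
# Round each valid numeric entry in a column to a fixed number of
# decimal places, leaving invalid entries untouched (the role that
# IFVALID plays in the transform above).
def round_column(values, digits=2):
    result = []
    for value in values:
        try:
            result.append(round(float(value), digits))
        except (TypeError, ValueError):
            result.append(value)  # not a parseable number: keep as-is
    return result

print(round_column(["3.14159", "2.7", "oops"]))
```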

## Standardize units

**Tip:** Each column that contains numeric values should have an identified unit of measurement. Ideally, this information is embedded in the column name. If the unit of measurement is not included, it can be difficult to properly interpret the data.

Trifacta Wrangler Enterprise does not impose any units on imported data. For example, a column of values in floating point format could represent centimeters, ounces, or any other unit of measurement. As long as the data conforms to the specified data type for the column, then Trifacta Wrangler Enterprise can work with it.

However, this flexibility can present issues for users of the dataset. If data is not clearly labeled and converted to a standardized set of units, its users are forced to make assumptions about the data, which can lead to misuse of it.

**Tip:** The meaning of some units of measure can change over time. For example, a US Dollar in 2010 does not have the same value as a dollar in 2015. When you standardize shifting units of measure, you should account for any time-based differences, if possible.

### Example - Fixed conversion factors

In many cases, units can be converted to other units by applying a fixed conversion factor to a column of data. For example, your dataset has the following three columns of measured data:

| Person | Height_ft | Weight_kg | Arm_Length_in |
|---|---|---|---|
| Jack | 5'10" | 92 kg | 32 |
| Jill | 5'2" | 56 kg | 29 |
| Joe | 6'3" | 101 kg | 35 |

The above data has the following issues:

- The Weight and Height columns contain unit identifiers, which forces the values to be treated as strings.
- Metric data (kg) is mixed with English unit data (ft and in).
- The Height data is non-numeric.

**Problem 1 - remove units**

The `Weight_kg` column contains a unit identifier. On import, these values are treated as strings, which limits their use for analysis.

**Steps:**

- In the data grid, select an instance of " kg". Note that the leading space should be selected, too.
- Among the suggestion cards, select the Replace card.
- It should automatically choose to replace the selection with nothing, effectively deleting the content. To check, click **Modify**. The transform should look like the following:

  ```
  replace col: Weight_kg on: ` kg` with: '' global: true
  ```

- Add it to your recipe.
- Verify that the column's data type has been changed to `Integer` or `Decimal`, depending on the values in it.
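The steps above can be sketched in Python for illustration; this hedged sketch assumes the unit suffix is always ` kg`, and the function name is hypothetical:

```python
# Strip the " kg" unit suffix and convert the remaining text to a
# number, mirroring the replace step above. Returns an int when the
# value has no fractional part, otherwise a float.
def strip_unit(value, suffix=" kg"):
    number = float(value.replace(suffix, ""))
    return int(number) if number.is_integer() else number

print([strip_unit(w) for w in ["92 kg", "56 kg", "101 kg"]])
```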

**Problem 2 - convert metric to English units**

To normalize to English units, the Weight values can be converted by multiplying them by 2.2, since 1 kg is approximately 2.2 lb:

```
set col:Weight_kg value:(Weight_kg * 2.2)
```

If you want to round the value to the nearest integer, use the following:

```
set col:Weight_kg value:ROUND((Weight_kg * 2.2))
```

After the above is added to the recipe, you should rename the column to `Weight_lbs`.
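For illustration, the same conversion can be sketched in Python; the function name is hypothetical:

```python
# Convert kilograms to pounds using the same approximate factor as the
# recipe (1 kg is roughly 2.2 lb), optionally rounding to the nearest
# integer as the ROUND variant does.
def kg_to_lb(weight_kg, as_integer=False):
    pounds = weight_kg * 2.2
    return round(pounds) if as_integer else pounds

print(kg_to_lb(92, as_integer=True))
```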

**Problem 3 - convert ft/in to in**

The final issue involves converting the `Height_ft` values to a single value in inches, so that these values can be used consistently with the other columns in the dataset.

On import, your data for the column might actually look like the following:

| Height_ft |
|---|
| "5'10"" |
| "5'2"" |
| "6'3"" |

**Steps:**

- Select the first quote mark in one of the entries.
- In the suggestion cards, select the Replace card. Select the variant that deletes all quotes in the column. The full command should look like the following:

  ```
  replace col: Height_ft on: `"` with: '' global: true
  ```

- Add it to your recipe.
- The remaining steps compute the number of inches: multiply the feet by 12, and then add the number of inches, using new columns of data.
- Select the single quote mark, and choose the Split suggestion card. This transform step should split the column into two columns: `Height_ft1` and `Height_ft2`.
- Derive the value in inches:

  ```
  derive type:single value:((Height_ft1 * 12) + Height_ft2)
  ```

- Add it to your recipe.
- Rename the new column: `Height_in`.
- You can drop the other, interim columns.
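The replace, split, and derive steps above can be sketched as a single Python function for illustration; the function name is hypothetical, and the input is assumed to match the quoted feet-and-inches format shown earlier:

```python
# Parse a height like 5'10" into total inches: remove the quote marks,
# split on the foot mark, then compute feet * 12 + inches.
def height_to_inches(height):
    cleaned = height.replace('"', '')
    feet, inches = cleaned.split("'")
    return int(feet) * 12 + int(inches)

print([height_to_inches(h) for h in ['5\'10"', '5\'2"', '6\'3"']])
```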

### Dynamic conversion factors

In some cases, the conversion rate between two different units of measure is dynamic. A common example involves mismatches between currencies. For example, one dataset might use U.S. dollars while another represents values in Euros.

#### Within a column

If you have inconsistent units within a column, it might be possible to correct these values by applying a multiple. For example, you might be able to determine that some values are in kilometers, instead of meters, based on their much smaller values. Multiplying the kilometer values by 1000 should standardize your units. The following multiplies all values in the column `Distance` that are less than 1000 by 1000:

```
set col:Distance value:(Distance < 1000 ? (Distance * 1000) : Distance)
```

Note the implied assumption that there are no distances in kilometers that are over 1000.

**NOTE:** Inconsistency in units within a column indicates a problem in either the source data or in how the column data was modified after import. Where possible, you should try to fix these issues in the source data first, as they can introduce problems when the data is used.
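The conditional rescaling above can be sketched in Python for illustration; the function name and threshold handling are hypothetical:

```python
# Standardize mixed meter/kilometer values in a distance column: values
# below the threshold are assumed to be kilometers and are scaled by
# 1000. As in the transform above, this assumes no kilometer value
# ever exceeds the threshold.
def standardize_distance(distance, threshold=1000):
    return distance * 1000 if distance < threshold else distance

print([standardize_distance(d) for d in [5, 3500, 12]])
```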

## Adjust level of precision

For numeric values that are used for measurement, you can adjust the level of precision within and across columns of values. For example, you have the following columns of data:

| Name | Width_cm | Height_cm |
|---|---|---|
| Object 1 | 23.3 | 55.5512 |
| Object 2 | 65.2 | 102.4024 |
| Object 3 | 54.2 | 12.22 |

In the above, you can see the following precision mismatches:

- The Height column contains one value (`12.22`) with only two digits of precision after the decimal point, while the other Height values have four.
- The Width column uses one digit of precision after the decimal point, while the Height column contains more digits of precision.

Where precision in measurement is important, you should consider rounding to the lowest level of precision. In this case, within the Height column, that level is two digits after the decimal point (e.g. `12.22`). However, across all of the columns of the dataset, the level of precision is one digit after the decimal point, as the Width values are all restricted to this level. While you could choose to pad all columns out to four digits, the extra `0` values do not accurately reflect measurement and are therefore misleading.

You can use the following transforms to perform rounding within these columns:

```
set col:Width_cm value:NUMFORMAT(Width_cm, '#.#')
set col:Height_cm value:NUMFORMAT(Height_cm, '#.#')
```

**NOTE:** The above assumes that the number of significant digits remains fixed in the source data. If this varies over time or across uses of the transform recipe, you might need to revisit these specific transform steps.

**NOTE:** The above formatting option drops the zero for values like `4.0`. As an alternative, you can use a format of `'#.0'`, which always inserts a zero, even in cases where the zero is not present.

**Results:**

| Name | Width_cm | Height_cm |
|---|---|---|
| Object 1 | 23.3 | 55.5 |
| Object 2 | 65.2 | 102.4 |
| Object 3 | 54.2 | 12.2 |
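The effect of the two format strings can be sketched in Python. This is a rough, illustrative analogue only (NUMFORMAT's rounding behavior may differ in edge cases), and the function name is hypothetical:

```python
# Approximate the '#.#' and '#.0' formats: render one digit after the
# decimal point, then optionally trim a trailing ".0" ('#.#' behavior).
def numformat(value, keep_zero=False):
    text = f"{value:.1f}"             # always one digit, like '#.0'
    if not keep_zero and text.endswith(".0"):
        text = text[:-2]              # drop the zero, like '#.#'
    return text

print([numformat(v) for v in [23.34, 4.0, 12.22]])
```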

### Adjust data granularity by aggregation

For data hierarchies, you can use aggregations to adjust the granularity of your data to the appropriate grouping level. For example, you want to join a dataset that is organized by individual products with a dataset that is organized by brand. In most cases, you should aggregate the product-level data in the first dataset to the brand level.

**NOTE:** When aggregation is applied, a new table of data is generated with the columns that you specifically select for inclusion.

For more information, see Pivot Transform.
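As a sketch of the idea in Python (standard library only; the brand, product, and sales fields are hypothetical), product-level rows can be rolled up to brand level before such a join:

```python
# Aggregate product-level rows to brand level by summing sales,
# mirroring the group-and-aggregate step described above.
from collections import defaultdict

products = [
    {"brand": "Acme", "product": "Widget", "sales": 120},
    {"brand": "Acme", "product": "Gadget", "sales": 80},
    {"brand": "Globex", "product": "Gizmo", "sales": 200},
]

brand_sales = defaultdict(int)
for row in products:
    brand_sales[row["brand"]] += row["sales"]

print(dict(brand_sales))
```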
