RandomSec/main/resources/schemas/TableSchemaValidator.json

{
"version": "1.0.0",
"errors": {
"io-error": {
"name": "IO Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source returned an IO Error of type {error_type}",
"description": "Data reading error because of IO error.\n\n How it could be resolved:\n - Fix path if it's not correct."
},
"http-error": {
"name": "HTTP Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source returned an HTTP error with a status code of {status_code}",
"description": "Data reading error because of HTTP error.\n\n How it could be resolved:\n - Fix url link if it's not correct."
},
"source-error": {
"name": "Source Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source has not supported or has inconsistent contents; no tabular data can be extracted",
"description": "Data reading error because of not supported or inconsistent contents.\n\n How it could be resolved:\n - Fix data contents (e.g. change JSON data to array or arrays/objects).\n - Set correct source settings in {validator}."
},
"scheme-error": {
"name": "Scheme Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source is in an unknown scheme; no tabular data can be extracted",
"description": "Data reading error because of incorrect scheme.\n\n How it could be resolved:\n - Fix data scheme (e.g. change scheme from `ftp` to `http`).\n - Set correct scheme in {validator}."
},
"format-error": {
"name": "Format Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source is in an unknown format; no tabular data can be extracted",
"description": "Data reading error because of incorrect format.\n\n How it could be resolved:\n - Fix data format (e.g. change file extension from `txt` to `csv`).\n - Set correct format in {validator}."
},
"encoding-error": {
"name": "Encoding Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source could not be successfully decoded with {encoding} encoding",
"description": "Data reading error because of an encoding problem.\n\n How it could be resolved:\n - Fix data source if it's broken.\n - Set correct encoding in {validator}."
},
"blank-header": {
"name": "Blank Header",
"type": "structure",
"context": "head",
"weight": 3,
"message": "Header in column {column_number} is blank",
"description": "A column in the header row is missing a value. Column names should be provided.\n\n How it could be resolved:\n - Add the missing column name to the first row of the data source.\n - If the first row starts with, or ends with a comma, remove it.\n - If this error should be ignored disable `blank-header` check in {validator}."
},
"duplicate-header": {
"name": "Duplicate Header",
"type": "structure",
"context": "head",
"weight": 3,
"message": "Header in column {column_number} is duplicated to header in column(s) {column_numbers}",
"description": "Two columns in the header row have the same value. Column names should be unique.\n\n How it could be resolved:\n - Add the missing column name to the first row of the data.\n - If the first row starts with, or ends with a comma, remove it.\n - If this error should be ignored disable `duplicate-header` check in {validator}."
},
"blank-row": {
"name": "Blank Row",
"type": "structure",
"context": "body",
"weight": 9,
"message": "Row {row_number} is completely blank",
"description": "This row is empty. A row should contain at least one value.\n\n How it could be resolved:\n - Delete the row.\n - If this error should be ignored disable `blank-row` check in {validator}."
},
"duplicate-row": {
"name": "Duplicate Row",
"type": "structure",
"context": "body",
"weight": 5,
"message": "Row {row_number} is duplicated to row(s) {row_numbers}",
"description": "The exact same data has been seen in another row.\n\n How it could be resolved:\n - If some of the data is incorrect, correct it.\n - If the whole row is an incorrect duplicate, remove it.\n - If this error should be ignored disable `duplicate-row` check in {validator}."
},
"extra-value": {
"name": "Extra Value",
"type": "structure",
"context": "body",
"weight": 9,
"message": "Row {row_number} has an extra value in column {column_number}",
"description": "This row has more values compared to the header row (the first row in the data source). A key concept is that all the rows in tabular data must have the same number of columns.\n\n How it could be resolved:\n - Check data has an extra comma between the values in this row.\n - If this error should be ignored disable `extra-value` check in {validator}."
},
"missing-value": {
"name": "Missing Value",
"type": "structure",
"context": "body",
"weight": 9,
"message": "Row {row_number} has a missing value in column {column_number}",
"description": "This row has less values compared to the header row (the first row in the data source). A key concept is that all the rows in tabular data must have the same number of columns.\n\n How it could be resolved:\n - Check data is not missing a comma between the values in this row.\n - If this error should be ignored disable `missing-value` check in {validator}."
},
"schema-error": {
"name": "Table Schema Error",
"type": "schema",
"context": "table",
"weight": 15,
"message": "Table Schema error: {error_message}",
"description": "Provided schema is not valid.\n\n How it could be resolved:\n - Update schema descriptor to be a valid descriptor\n - If this error should be ignored disable schema checks in {validator}."
},
"non-matching-header": {
"name": "Non-Matching Header",
"type": "schema",
"context": "head",
"weight": 9,
"message": "Header in column {column_number} doesn't match field name {field_name} in the schema",
"description": "One of the data source headers doesn't match the field name defined in the schema.\n\n How it could be resolved:\n - Rename header in the data source or field in the schema\n - If this error should be ignored disable `non-matching-header` check in {validator}."
},
"extra-header": {
"name": "Extra Header",
"type": "schema",
"context": "head",
"weight": 9,
"message": "There is an extra header in column {column_number}",
"description": "The first row of the data source contains header that doesn't exist in the schema.\n\n How it could be resolved:\n - Remove the extra column from the data source or add the missing field to the schema\n - If this error should be ignored disable `extra-header` check in {validator}."
},
"missing-header": {
"name": "Missing Header",
"type": "schema",
"context": "head",
"weight": 9,
"message": "There is a missing header in column {column_number}",
"description": "Based on the schema there should be a header that is missing in the first row of the data source.\n\n How it could be resolved:\n - Add the missing column to the data source or remove the extra field from the schema\n - If this error should be ignored disable `missing-header` check in {validator}."
},
"type-or-format-error": {
"name": "Type or Format Error",
"type": "schema",
"context": "body",
"weight": 9,
"message": "The value {value} in row {row_number} and column {column_number} is not type {field_type} and format {field_format}",
"description": "The value does not match the schema type and format for this field.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If this value is correct, adjust the type and/or format.\n - To ignore the error, disable the `type-or-format-error` check in {validator}. In this case all schema checks for row values will be ignored."
},
"required-constraint": {
"name": "Required Constraint",
"type": "schema",
"context": "body",
"weight": 9,
"message": "Column {column_number} is a required field, but row {row_number} has no value",
"description": "This field is a required field, but it contains no value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove the `required` constraint from the schema.\n - If this error should be ignored disable `required-constraint` check in {validator}."
},
"pattern-constraint": {
"name": "Pattern Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the pattern constraint of {constraint}",
"description": "This field value should conform to constraint pattern.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `pattern` constraint in the schema.\n - If this error should be ignored disable `pattern-constraint` check in {validator}."
},
"unique-constraint": {
"name": "Unique Constraint",
"type": "schema",
"context": "body",
"weight": 9,
"message": "Rows {row_numbers} has unique constraint violation in column {column_number}",
"description": "This field is a unique field but it contains a value that has been used in another row.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then the values in this column are not unique. Remove the `unique` constraint from the schema.\n - If this error should be ignored disable `unique-constraint` check in {validator}."
},
"enumerable-constraint": {
"name": "Enumerable Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the given enumeration: {constraint}",
"description": "This field value should be equal to one of the values in the enumeration constraint.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `enum` constraint in the schema.\n - If this error should be ignored disable `enumerable-constraint` check in {validator}."
},
"minimum-constraint": {
"name": "Minimum Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the minimum constraint of {constraint}",
"description": "This field value should be greater or equal than constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `minimum` constraint in the schema.\n - If this error should be ignored disable `minimum-constraint` check in {validator}."
},
"maximum-constraint": {
"name": "Maximum Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the maximum constraint of {constraint}",
"description": "This field value should be less or equal than constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `maximum` constraint in the schema.\n - If this error should be ignored disable `maximum-constraint` check in {validator}."
},
"minimum-length-constraint": {
"name": "Minimum Length Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the minimum length constraint of {constraint}",
"description": "A lenght of this field value should be greater or equal than schema constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `minimumLength` constraint in the schema.\n - If this error should be ignored disable `minimum-length-constraint` check in {validator}."
},
"maximum-length-constraint": {
"name": "Maximum Length Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the maximum length constraint of {constraint}",
"description": "A lenght of this field value should be less or equal than schema constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `maximumLength` constraint in the schema.\n - If this error should be ignored disable `maximum-length-constraint` check in {validator}."
}
}
}
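
The error messages above are templates whose {placeholder} tokens (for example {column_number} or {row_number}) are filled in at validation time. Below is a minimal, hypothetical Java sketch of how a consumer of this file might perform that substitution; the class and method names (ValidatorMessages, formatMessage) are illustrative only and are not taken from the validator's actual code.

import java.util.HashMap;
import java.util.Map;

public class ValidatorMessages {

    // Replace each {key} token in a message template with its value.
    static String formatMessage(String template, Map<String, String> values) {
        String result = template;
        for (Map.Entry<String, String> entry : values.entrySet()) {
            result = result.replace("{" + entry.getKey() + "}", entry.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        // Template copied from the "blank-header" entry above.
        String template = "Header in column {column_number} is blank";

        Map<String, String> values = new HashMap<>();
        values.put("column_number", "3");

        // Prints: Header in column 3 is blank
        System.out.println(formatMessage(template, values));
    }
}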