data package metadata (#1398)

* fix the appbundle issue #1209

* fix #1162

allow JRE 9

* fix the package declarations

* remove the _ from the method name

* use the explicit scoping

* remove the extra ;

* fix issues from codacy

* fix issues from codacy

* add preferences link to the index page

* handle the empty user metadata

* fix 'last modified' sorting issue #1307

* prevent overflow of the table. issue #1306

* add isoDateParser to sort the date

* prevent overflow of the project index

* remove sorter arrow for action columns

* disable editing the internal metadata

* adjust the width of the table

* change MetaData to Metadata

* change the field name from rowNumber to rowCount

* put back the incidentally deleted .gitignore

* add double quote to prevent word splitting

* UI improvement on metadata view and project list view

* remove the date field in metadata

* notification message about the free RAM. Issue #1295

* UI tuning for metadata view

* shorten the ISO date to locale date format
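
The date-shortening commit converts stored ISO timestamps into a locale-aware display form. A minimal java.time sketch of the idea (the class and method names here are hypothetical, not the PR's actual code):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.FormatStyle;
import java.util.Locale;

public class IsoDateShortener {

    // Parse an ISO-8601 timestamp and render only its date part in the
    // given locale. Names are hypothetical, for illustration only.
    public static String shorten(String isoTimestamp, Locale locale) {
        LocalDateTime parsed = LocalDateTime.parse(isoTimestamp);
        DateTimeFormatter formatter =
                DateTimeFormatter.ofLocalizedDate(FormatStyle.MEDIUM).withLocale(locale);
        return parsed.toLocalDate().format(formatter);
    }

    public static void main(String[] args) {
        System.out.println(shorten("2018-02-02T08:24:19", Locale.US)); // Feb 2, 2018
    }
}
```

Dropping the time component before formatting is what keeps the project-index column compact.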

* Added translation using Weblate (Portuguese (Brazil))

* remove the rename link

* Ignore empty language files introduced by Weblate

* Add UI for Invert text filter

* Backend support for Inverting Text search facets

* Fix reset on text search facet

* More succinct return statements

* add tests for SetProjectMetadataCommand

* Tidying up for Codacy

* Added Tests for TextSearchFilter

* Corrections for Codacy

* More code tidy up

* let the browser auto fit the table cell when resizing/zooming

* fix importing multiple Excel files with multiple sheets. issue #1328

* check if the project has the userMetadata

* fix the unit test
support multiple files with multiple tables for OpenOffice

* prevent duplicate keys for user metadata

* replace _ with variable for exception

* fix the no-undef issue

* adjust the width of the transform dialog. issue #1332

* fix the row count refresh issue

* extract method

* move the log message

* cosmetic changes for codacy

* fix typo

* bump to version 2.8

* .gitignore is now working

* preview stage won't have the metadata populated, so guard against NPE

* Update README.md

No more direct link to the latest version tag, which avoids having to remember to update the readme

* refactoring the ProjectMetadata class

* introduce the IMetadata interface

* create submodule of dataschema

* add back

* set up the lib for dataschema; upgrade apache lang to lang3

* replace escape* functions from apache lang3

* replace the ProjectMetadata with IMetadata interface

* add missing jars

* make IMetadata a field of Project

* remove PreferenceStore out of Project model

* fix test SetProjectMetadataCommandTests by casting

* introduce the AbstractMetadata

* introduce the AbstractMetadata

* reorganize the metadata package

* allow multiple metadata for a project

* support for multiple metadata formats

* remove jdk7 since the 'table schema' java implementation only supports jdk8+

* set execute permission for script

* fix the Unit Test after Metadata refactoring

* restore apache lang 2.5 since jetty 6.1.22 depends on it

* add commons lang 2.5 jar

* git submodule add https://github.com/frictionlessdata/datapackage-java

* remove the metadata parameter from the ProjectManager.registerProject method

* remove hashmap _projectsMetadata field from the ProjectManager and FileProjectManager

* init the Project.metadataMap

* fix Unit Test

* restore the ProjectMetadata map to ProjectManager

* put the ProjectMetadata in place for the ProjectManager and Project objects

* check the singleton for null instead of creating a constructor just for tests

* load the data package metadata

* importing data package

* importing data package

* encapsulate the Package class into DataPackageMetadata

* use _ to indicate the class fields

* introduce base URL in order to download the data files

* import data package UI and draft backend

* import data package UI

* fix typo

* download the data set pointed to by the metadata resource

* save and load the data package metadata

* avoid magic string

* package cleanup

* set the java_version to 1.8

* set the min jdk to 1.8

* add the 3rd party src in the build.xml

* skip the file selection page if only 1 DATA file

* add files structure for json editor

* separate out the metadata file from the retrieval file list

* rename the OKF_METADATA to DATAPACKAGE_METADATA

* clean up

* implement GetMetadateCommand class

* display the metadata in json format

* git submodule update --remote --merge

* adjust the setting after pulling from datapackage origin

* fix the failing UT DateExtensionTests.testFetchCounts: the new json jar json-20160810.jar complains that JSONObject["float"] is not a string

* clean up the weird array-loop syntax that was flagged

* remove the unused constant

* export in data package format

* interface cleanup

* fix UT

* edit the metadata

* add UT for SetMetadataCommand

* fix UT for SetMetadataCommand

* display the data package metadata link on the project index page

* update submodule

* log the exceptions

* Ajv does not work properly, use the back end validation instead

* enable the validation for jsoneditor

* first draft of the data validation

* create a map to hold the constraint and its handler
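
The constraint-to-handler map mentioned above can be sketched as a small dispatch table; the class name and the BiPredicate signature are assumptions for illustration, not the PR's actual code:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.BiPredicate;

public class ConstraintDispatch {

    // Registry from constraint name (as it appears in the schema) to the
    // handler that checks it. Shapes here are hypothetical.
    static final Map<String, BiPredicate<String, Object>> HANDLERS = new HashMap<>();
    static {
        HANDLERS.put("required", (value, arg) ->
                !Boolean.TRUE.equals(arg) || (value != null && !value.isEmpty()));
        HANDLERS.put("minLength", (value, arg) ->
                value == null || value.length() >= (Integer) arg);
    }

    // Unknown constraints pass by default; known ones delegate to their handler.
    public static boolean check(String constraint, String value, Object arg) {
        BiPredicate<String, Object> handler = HANDLERS.get(constraint);
        return handler == null || handler.test(value, arg);
    }

    public static void main(String[] args) {
        System.out.println(check("required", "", true));   // false
        System.out.println(check("minLength", "abcd", 3)); // true
    }
}
```

A map keyed by constraint name keeps adding a new constraint down to one registry entry plus its handler.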

* rename

* support for minLength and maxLength from spec
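
A hedged sketch of what the minLength/maxLength check involves, following the Table Schema spec's string-length semantics (the helper name is hypothetical):

```java
public class LengthConstraints {

    // Mirror of Table Schema's minLength/maxLength semantics for string
    // values; a null bound means "not specified". Name is hypothetical.
    public static boolean satisfies(String value, Integer minLength, Integer maxLength) {
        if (value == null) {
            return true; // presence is the required constraint's job
        }
        if (minLength != null && value.length() < minLength) {
            return false;
        }
        if (maxLength != null && value.length() > maxLength) {
            return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(satisfies("abc", 2, 5));  // true
        System.out.println(satisfies("a", 2, null)); // false
    }
}
```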

* add validate command

* test the operation instead of the validate command

* rename the UT

* format the error message and push to the report

* fix row number

* add resource bundle for validator

* inject the code of the constraints

* make the StrSubstitutor work
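
StrSubstitutor (from the commons-text jar this commit adds to the classpath) expands `${name}` placeholders in validation messages. To keep this sketch dependency-free, here is a minimal stand-in built on java.util.regex that mimics that behavior; it is not the PR's actual code:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TemplateSubstitutor {

    private static final Pattern VAR = Pattern.compile("\\$\\{([^}]+)\\}");

    // Replace each ${name} placeholder with its value from the map,
    // leaving unknown placeholders untouched.
    public static String substitute(String template, Map<String, String> values) {
        Matcher m = VAR.matcher(template);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String replacement = values.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(out, Matcher.quoteReplacement(replacement));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Map<String, String> values = new HashMap<>();
        values.put("row", "7");
        values.put("field", "price");
        System.out.println(substitute("Row ${row}: invalid value in ${field}", values));
        // Row 7: invalid value in price
    }
}
```

With the real library, the same call shape is `new StrSubstitutor(values).replace(template)`.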

* extract the type and format information

* add the customizedFormat to the interface to allow proper formatting

* get rid of magic string

* take care of missing parts of the data package

* implement RequiredConstraint

* patch for number type

* add max/min constraints

* get the constraints directly from the field

* implement the PatternConstraint
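
Per the Table Schema spec, a pattern constraint requires the whole cell value to match a regex. A minimal sketch (hypothetical class shape, not the PR's actual implementation):

```java
import java.util.regex.Pattern;

public class PatternConstraint {

    private final Pattern pattern;

    // The schema's "pattern" property holds a regex that the entire
    // value must match (not merely contain).
    public PatternConstraint(String regex) {
        this.pattern = Pattern.compile(regex);
    }

    public boolean satisfies(String value) {
        return value == null || pattern.matcher(value).matches();
    }

    public static void main(String[] args) {
        PatternConstraint zip = new PatternConstraint("\\d{5}");
        System.out.println(zip.satisfies("02139")); // true
        System.out.println(zip.satisfies("2139"));  // false
    }
}
```

Using `matches()` rather than `find()` is the detail that enforces whole-value matching.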

* suppress warning

* fix the broken UT expecting a 2-digit fraction

* handle the cast and type properly

* fix the missing resource files for data package when run from command line

* use the copy instead of copydir

* add script for appveyor

* update script for appveyor

* do recursive clone

* correct the git url

* fix clone path

* clone folder option does not work

* will put another PR for this. delete for now

* revert the interface method name

* lazy loading the project data

* disable the validate menu for now

* add UT

* assert UTs

* add UT

* fix #1386

* remove import

* test the thread

* Revert "test the thread"

This reverts commit 779214160055afe3ccdcc18c57b0c7c72e87c824.

* fix the URLCachingTest UT

* define the template data package

* tidy up the metadata interface

* check the http response code

* fix the package

* display user friendly message when URL path is not reachable

* populate the data package schema

* Delete hs_err_pid15194.log

* populate data package info

* add a username preference; it will be pulled in as the creator of the metadata

* undo the project.updateColumnChange() and start to introduce the fields into the existing core model

* tightly integrate the data package metadata

* tightly integrate the data package metadata for project level

* remove the submodule

* move the edit button

* clean up build

* load the new property

* load the project metadata

* fix issues from codacy

* remove unused fields and annotation

* check the http response code first

* import zipped data package

* allow without keywords

* process the zip data package from url

* merge the tags

* check the store first

* remove the table schema src

* move the json schema files to schema dir

* add comment

* add comment

* remove git modules

* add back the incidentally deleted file

* fix typo

* remove SetMetadataCommand

* revert change

* merge from master

This commit is contained in:
Jacky 2018-02-02 08:24:19 -05:00 committed by Antonin Delpeuch
parent cd58557424
commit c4b0ff6bea
222 changed files with 56544 additions and 1240 deletions

@@ -1,6 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="main/src"/>
<classpathentry kind="src" path="main/resources"/>
<classpathentry kind="src" path="extensions/jython/tests/src"/>
<classpathentry kind="src" path="server/src"/>
<classpathentry kind="src" path="extensions/gdata/src"/>
@@ -14,15 +15,9 @@
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/arithcode-1.1.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/butterfly-1.0.1.jar" sourcepath="main/webapp/WEB-INF/lib-src/butterfly-1.0.1-sources.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/clojure-1.5.1-slim.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-codec-1.6.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-collections-3.2.1.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-fileupload-1.2.1.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-io-1.4.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-lang-2.5.jar" sourcepath="main/webapp/WEB-INF/lib-src/commons-lang-2.5-sources.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/dom4j-1.6.1.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/jcl-over-slf4j-1.5.6.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/jrdf-0.5.6.jar" sourcepath="main/webapp/WEB-INF/lib-src/jrdf-0.5.6-sources.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/json-20100208.jar" sourcepath="main/webapp/WEB-INF/lib-src/json-20100208-sources.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/lessen-trunk-r8.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/log4j-1.2.15.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/marc4j-2.4.jar"/>
@@ -46,8 +41,6 @@
<classpathentry exported="true" kind="lib" path="extensions/gdata/module/MOD-INF/lib/gdata-core-1.0.jar"/>
<classpathentry exported="true" kind="lib" path="extensions/gdata/module/MOD-INF/lib/gdata-spreadsheet-3.0.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/jsoup-1.4.1.jar"/>
<classpathentry exported="true" kind="lib" path="main/tests/server/lib/mockito-all-1.9.5.jar"/>
<classpathentry exported="true" kind="lib" path="main/tests/server/lib/testng-6.8.jar"/>
<classpathentry exported="true" kind="lib" path="extensions/gdata/module/MOD-INF/lib/gdata-docs-3.0.jar"/>
<classpathentry exported="true" kind="lib" path="extensions/gdata/module/MOD-INF/lib/gdata-docs-meta-3.0.jar"/>
<classpathentry exported="true" kind="lib" path="extensions/gdata/module/MOD-INF/lib/gdata-media-1.0.jar"/>
@@ -67,7 +60,6 @@
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/signpost-core-1.2.1.2.jar" sourcepath="main/webapp/WEB-INF/lib-src/signpost-core-1.2.1.2-sources.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/guava-13.0.jar"/>
<classpathentry kind="lib" path="extensions/gdata/module/MOD-INF/lib/jsr305-1.3.9.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-logging-1.1.1.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/fluent-hc-4.2.5.jar"/>
<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/httpmime-4.2.5.jar"/>
<classpathentry kind="lib" path="extensions/gdata/module/MOD-INF/lib/commons-logging-1.1.1.jar"/>
@@ -89,6 +81,21 @@
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/jackson-annotations-2.9.1.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/jackson-core-2.9.1.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/jackson-databind-2.9.1.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/json-20160810.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-beanutils-1.9.3.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-collections-3.2.2.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-digester-1.8.1.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-lang3-3.6.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-logging-1.2.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-text-1.1.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-validator-1.5.1.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/hamcrest-all-1.3.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/icu4j-4.2.1.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/joda-time-2.9.9.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/opencsv-4.0.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/org.everit.json.schema-1.5.1.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-fileupload-1.2.1.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-lang-2.5.jar"/>
<classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/mysql-connector-java-5.1.44-bin.jar"/>
<classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/json-simple-1.1.1.jar"/>
<classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/jackson-mapper-asl-1.9.13.jar"/>
@@ -97,5 +104,20 @@
<classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/jackson-core-asl-1.9.13.jar"/>
<classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/jasypt-1.9.2.jar"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-csv-1.5.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/junit-4.12.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/bsh-2.0b4.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/byte-buddy-1.6.14.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/byte-buddy-agent-1.6.14.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/cglib-nodep-2.2.2.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/hamcrest-core-1.3.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/javassist-3.21.0-GA.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/jcommander-1.48.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/mockito-core-2.8.9.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/objenesis-2.5.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/powermock-mockito2-1.7.1-full.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/testng-6.9.10.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/datapackage-java-1.0-SNAPSHOT.jar"/>
<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/tableschema-java-1.0-SNAPSHOT.jar"/>
<classpathentry kind="output" path="main/webapp/WEB-INF/classes"/>
</classpath>

@@ -1,6 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry excluding="build/**|main/webapp/modules/core/MOD-INF/controller.js|main/webapp/modules/core/externals/|test-output/" kind="src" path=""/>
<classpathentry kind="src" path=""/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.JRE_CONTAINER"/>
<classpathentry kind="output" path=""/>
</classpath>

@@ -57,6 +57,7 @@ org.eclipse.wst.jsdt.core.compiler.problem.unusedParameterIncludeDocCommentRefer
org.eclipse.wst.jsdt.core.compiler.problem.unusedParameterWhenImplementingAbstract=disabled
org.eclipse.wst.jsdt.core.compiler.problem.unusedPrivateMember=warning
org.eclipse.wst.jsdt.core.compiler.source=1.3
org.eclipse.wst.jsdt.core.compiler.source.type=script
org.eclipse.wst.jsdt.core.compiler.taskCaseSensitive=enabled
org.eclipse.wst.jsdt.core.compiler.taskPriorities=NORMAL,HIGH,NORMAL
org.eclipse.wst.jsdt.core.compiler.taskTags=TODO,FIXME,XXX
@@ -318,4 +319,5 @@ org.eclipse.wst.jsdt.core.formatter.tabulation.char=space
org.eclipse.wst.jsdt.core.formatter.tabulation.size=4
org.eclipse.wst.jsdt.core.formatter.use_tabs_only_for_leading_indentations=false
org.eclipse.wst.jsdt.core.formatter.wrap_before_binary_operator=true
semanticValidation=enabled
semanticValidation=disabled
strictOnKeywordUsage=disabled

@@ -0,0 +1,8 @@
DELEGATES_PREFERENCE=delegateValidatorList
USER_BUILD_PREFERENCE=enabledBuildValidatorList
USER_MANUAL_PREFERENCE=enabledManualValidatorList
USER_PREFERENCE=overrideGlobalPreferencestruedisableAllValidationtrueversion1.2.700.v201508251749
eclipse.preferences.version=1
override=true
suspend=true
vf.version=3

@@ -1,17 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="con" path="com.google.appengine.eclipse.core.GAE_CONTAINER"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.6"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/butterfly-trunk.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/butterfly-trunk.jar"/>
<classpathentry kind="lib" path="/grefine-server/lib/servlet-api-2.5.jar" sourcepath="/grefine-server/lib-src/servlet-api-2.5-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/json-20100208.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/json-20100208-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/httpclient-4.0.1.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/httpclient-4.0.1-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/httpcore-4.0.1.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/httpcore-4.0.1-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/slf4j-api-1.5.6.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/slf4j-api-1.5.6-sources.jar"/>
<classpathentry combineaccessrules="false" kind="src" path="/grefine-broker"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/rhino-1.7R2.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/velocity-1.5.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/commons-collections-3.2.1.jar"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

@@ -1,43 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine-appengine-broker</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.ui.externaltools.ExternalToolBuilder</name>
<triggers>full,incremental,</triggers>
<arguments>
<dictionary>
<key>LaunchConfigHandle</key>
<value>&lt;project&gt;/.externalToolBuilders/com.google.gdt.eclipse.core.webAppProjectValidator.launch</value>
</dictionary>
</arguments>
</buildCommand>
<buildCommand>
<name>com.google.appengine.eclipse.core.enhancerbuilder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.ui.externaltools.ExternalToolBuilder</name>
<triggers>full,incremental,</triggers>
<arguments>
<dictionary>
<key>LaunchConfigHandle</key>
<value>&lt;project&gt;/.externalToolBuilders/com.google.appengine.eclipse.core.projectValidator.launch</value>
</dictionary>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
<nature>com.google.appengine.eclipse.core.gaeNature</nature>
</natures>
</projectDescription>

@@ -1,23 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="src" path="tests/src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.6"/>
<classpathentry kind="lib" path="/grefine-server/lib/servlet-api-2.5.jar" sourcepath="/grefine-server/lib-src/servlet-api-2.5-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/butterfly-trunk.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/butterfly-trunk.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/json-20100208.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/json-20100208-sources.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/bdb-je-4.0.103.jar" sourcepath="module/MOD-INF/lib-src/bdb-je-4.0.103-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/httpclient-4.0.1.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/httpclient-4.0.1-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/httpcore-4.0.1.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/httpcore-4.0.1-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/slf4j-api-1.5.6.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/slf4j-api-1.5.6-sources.jar"/>
<classpathentry kind="lib" path="/grefine/tests/server/lib/mockito-all-1.8.4.jar" sourcepath="/grefine/tests/server/lib-src/mockito-all-1.8.4-sources.jar"/>
<classpathentry kind="lib" path="/grefine/tests/server/lib/testng-5.12.1.jar" sourcepath="/grefine/tests/server/lib-src/testng-5.12.1-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/log4j-1.2.15.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/log4j-1.2.15-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/slf4j-log4j12-1.5.6.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/slf4j-log4j12-1.5.6-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/rhino-1.7R2.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/lessen-trunk-r8.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/commons-collections-3.2.1.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/jcl-over-slf4j-1.5.6.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/jcl-over-slf4j-1.5.6-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/velocity-1.5.jar"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

@@ -1,17 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine-broker</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
</natures>
</projectDescription>

@@ -140,6 +140,10 @@
<classpath refid="webapp.class.path" />
</javac>
<copy file="${webapp.src.dir}/log4j.properties" tofile="${webapp.classes.dir}/log4j.properties"/>
<copy file="${main.dir}/resources/schemas/datapackage-template.json" tofile="${webapp.classes.dir}/schemas/datapackage-template.json"/>
<copy file="${main.dir}/resources/schemas/TableSchemaValidator.json" tofile="${webapp.classes.dir}/schemas/TableSchemaValidator.json"/>
<copy file="${webapp.src.dir}/validator-resource-bundle.properties" tofile="${webapp.classes.dir}/validator-resource-bundle.properties"/>
<copy file="${webapp.src.dir}/log4j.properties" tofile="${webapp.classes.dir}/log4j.properties"/>
</target>
<target name="build_tests" depends="build">

@@ -46,7 +46,6 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.RefineServlet;
import com.google.refine.commands.HttpUtilities;
import com.google.refine.extension.database.model.DatabaseColumn;
@@ -56,6 +55,7 @@ import com.google.refine.importing.ImportingController;
import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingManager;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
import com.google.refine.util.ParsingUtilities;

@@ -25,7 +25,7 @@ import org.testng.annotations.Parameters;
import org.testng.annotations.Test;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.RefineServlet;
import com.google.refine.extension.database.mysql.MySQLDatabaseService;
import com.google.refine.extension.database.stub.RefineDbServletStub;

@@ -25,7 +25,7 @@ import org.testng.annotations.Parameters;
import org.testng.annotations.Test;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.RefineServlet;
import com.google.refine.extension.database.DBExtensionTestUtils;
import com.google.refine.extension.database.DBExtensionTests;

@@ -1,32 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7">
<attributes>
<attribute name="owner.project.facets" value="java"/>
</attributes>
</classpathentry>
<classpathentry combineaccessrules="false" kind="src" path="/grefine"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-core-1.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-spreadsheet-3.0.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/butterfly-1.0.1.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/jackson-core-asl-1.9.12.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-base-1.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-client-1.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-client-meta-1.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-docs-3.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-docs-meta-3.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-media-1.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-spreadsheet-meta-3.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/jsr305-1.3.9.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/mail.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-api-client-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-api-client-servlet-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-api-services-drive-v2-rev168-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-api-services-fusiontables-v2-rev3-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-http-client-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-http-client-jackson2-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-oauth-client-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-oauth-client-servlet-1.20.0.jar"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

@@ -1,31 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine-gdata-extension</name>
<comment></comment>
<projects>
<project>gridworks</project>
<project>gridworks-server</project>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.wst.jsdt.core.javascriptValidator</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.wst.common.project.facet.core.builder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
<nature>org.eclipse.wst.common.project.facet.core.nature</nature>
<nature>org.eclipse.wst.jsdt.core.jsNature</nature>
</natures>
</projectDescription>

@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path=""/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.JRE_CONTAINER"/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.WebProject">
<attributes>
<attribute name="hide" value="true"/>
</attributes>
</classpathentry>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.baseBrowserLibrary"/>
<classpathentry kind="output" path=""/>
</classpath>

@@ -1 +0,0 @@
/classes/

@@ -39,11 +39,11 @@ import com.google.api.services.fusiontables.model.Column;
import com.google.api.services.fusiontables.model.Sqlresponse;
import com.google.api.services.fusiontables.model.Table;
import com.google.refine.ProjectMetadata;
import com.google.refine.importers.TabularImportingParserBase;
import com.google.refine.importers.TabularImportingParserBase.TableDataReader;
import com.google.refine.importing.ImportingJob;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
/**

@@ -45,11 +45,11 @@ import com.google.gdata.data.spreadsheet.SpreadsheetEntry;
import com.google.gdata.data.spreadsheet.WorksheetEntry;
import com.google.gdata.util.ServiceException;
import com.google.refine.ProjectMetadata;
import com.google.refine.importers.TabularImportingParserBase;
import com.google.refine.importers.TabularImportingParserBase.TableDataReader;
import com.google.refine.importing.ImportingJob;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
/**

@@ -65,7 +65,6 @@ import com.google.gdata.util.AuthenticationException;
import com.google.gdata.util.ServiceException;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.RefineServlet;
import com.google.refine.commands.HttpUtilities;
import com.google.refine.importing.DefaultImportingController;
@@ -73,6 +72,7 @@ import com.google.refine.importing.ImportingController;
import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingManager;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
import com.google.refine.util.ParsingUtilities;

@@ -1,8 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.6"/>
<classpathentry combineaccessrules="false" kind="src" path="/grefine"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/jython-standalone-2.7.1.jar"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

@@ -1,17 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine-jython</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
</natures>
</projectDescription>

@@ -1,7 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
<classpathentry combineaccessrules="false" kind="src" path="/OpenRefine"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

@@ -1,17 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>refine-pd-extension</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
</natures>
</projectDescription>

View File

@@ -1 +0,0 @@
/classes/

View File

@@ -41,10 +41,10 @@ import org.json.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importers.TabularImportingParserBase;
import com.google.refine.importing.ImportingJob;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
public class PCAxisImporter extends TabularImportingParserBase {

View File

@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.6"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

View File

@@ -1,29 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine-sample-extension</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.wst.jsdt.core.javascriptValidator</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.wst.common.project.facet.core.builder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
<nature>org.eclipse.wst.common.project.facet.core.nature</nature>
<nature>org.eclipse.wst.jsdt.core.jsNature</nature>
</natures>
</projectDescription>

View File

@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path=""/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.JRE_CONTAINER"/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.WebProject">
<attributes>
<attribute name="hide" value="true"/>
</attributes>
</classpathentry>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.baseBrowserLibrary"/>
<classpathentry kind="output" path=""/>
</classpath>

View File

@@ -1 +0,0 @@
/classes/

View File

@@ -1,46 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="src" path="tests/server/src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/butterfly-1.0.1.jar" sourcepath="webapp/WEB-INF/lib-src/butterfly-1.0.1-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/commons-codec-1.6.jar" sourcepath="webapp/WEB-INF/lib-src/commons-codec-1.6-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/commons-lang-2.5.jar" sourcepath="webapp/WEB-INF/lib-src/commons-lang-2.5-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/commons-fileupload-1.2.1.jar" sourcepath="webapp/WEB-INF/lib-src/commons-fileupload-1.2.1-sources.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/json-20100208.jar" sourcepath="webapp/WEB-INF/lib-src/json-20100208-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/icu4j-4.2.1.jar" sourcepath="webapp/WEB-INF/lib-src/icu4j-4.2.1-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/arithcode-1.1.jar" sourcepath="webapp/WEB-INF/lib-src/arithcode-1.1-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/secondstring-20100303.jar" sourcepath="webapp/WEB-INF/lib-src/secondstring-20100303-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/ant-tools-1.8.0.jar" sourcepath="webapp/WEB-INF/lib-src/ant-tools-1.8.0-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/vicino-1.1.jar" sourcepath="webapp/WEB-INF/lib-src/vicino-1.1-sources.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/opencsv-2.4-SNAPSHOT.jar" sourcepath="tests/java/lib-src/opencsv-2.4-SNAPSHOT-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/jcl-over-slf4j-1.5.6.jar" sourcepath="webapp/WEB-INF/lib-src/jcl-over-slf4j-1.5.6-sources.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/slf4j-api-1.5.6.jar" sourcepath="webapp/WEB-INF/lib/slf4j-api-1.5.6.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/slf4j-log4j12-1.5.6.jar" sourcepath="webapp/WEB-INF/lib-src/slf4j-log4j12-1.5.6-sources.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/log4j-1.2.15.jar" sourcepath="webapp/WEB-INF/lib-src/log4j-1.2.15-sources.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/dom4j-1.6.1.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/xmlbeans-2.3.0.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/clojure-1.5.1-slim.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/jackson-core-asl-1.9.12.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/marc4j-2.4.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/jrdf-0.5.6.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/commons-collections-3.2.1.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/commons-io-1.4.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/rhino-1.7R2.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/velocity-1.5.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/lessen-trunk-r8.jar"/>
<classpathentry kind="lib" path="tests/server/lib/mockito-all-1.9.5.jar" sourcepath="tests/server/lib-src/mockito-all-1.9.5-sources.jar"/>
<classpathentry kind="lib" path="tests/server/lib/testng-6.8.jar" sourcepath="tests/server/lib-src/testng-6.8-sources.jar"/>
<classpathentry exported="true" kind="lib" path="/grefine-server/lib/servlet-api-2.5.jar" sourcepath="/grefine-server/lib-src/servlet-api-2.5-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/httpclient-4.2.5.jar" sourcepath="webapp/WEB-INF/lib-src/httpclient-4.2.5-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/httpcore-4.2.4.jar" sourcepath="webapp/WEB-INF/lib-src/httpcore-4.2.4-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/signpost-commonshttp4-1.2.1.2.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/signpost-core-1.2.1.2.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/jsoup-1.4.1.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/odfdom-java-0.8.7.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/guava-13.0.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/poi-3.13-20150929.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/poi-ooxml-3.13-20150929.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/poi-ooxml-schemas-3.13-20150929.jar"/>
<classpathentry kind="output" path="webapp/WEB-INF/classes"/>
</classpath>

main/.gitignore
View File

@@ -1 +0,0 @@
/test-output

View File

@@ -1,29 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.wst.jsdt.core.javascriptValidator</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.wst.common.project.facet.core.builder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
<nature>org.eclipse.wst.common.project.facet.core.nature</nature>
<nature>org.eclipse.wst.jsdt.core.jsNature</nature>
</natures>
</projectDescription>

View File

@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path=""/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.JRE_CONTAINER"/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.WebProject">
<attributes>
<attribute name="hide" value="true"/>
</attributes>
</classpathentry>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.baseBrowserLibrary"/>
<classpathentry kind="output" path=""/>
</classpath>

View File

@@ -0,0 +1,213 @@
{
"version": "1.0.0",
"errors": {
"io-error": {
"name": "IO Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source returned an IO Error of type {error_type}",
"description": "Data reading error because of IO error.\n\n How it could be resolved:\n - Fix path if it's not correct."
},
"http-error": {
"name": "HTTP Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source returned an HTTP error with a status code of {status_code}",
"description": "Data reading error because of HTTP error.\n\n How it could be resolved:\n - Fix url link if it's not correct."
},
"source-error": {
"name": "Source Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source has unsupported or inconsistent contents; no tabular data can be extracted",
"description": "Data reading error because of unsupported or inconsistent contents.\n\n How it could be resolved:\n - Fix data contents (e.g. change JSON data to array or arrays/objects).\n - Set correct source settings in {validator}."
},
"scheme-error": {
"name": "Scheme Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source is in an unknown scheme; no tabular data can be extracted",
"description": "Data reading error because of incorrect scheme.\n\n How it could be resolved:\n - Fix data scheme (e.g. change scheme from `ftp` to `http`).\n - Set correct scheme in {validator}."
},
"format-error": {
"name": "Format Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source is in an unknown format; no tabular data can be extracted",
"description": "Data reading error because of incorrect format.\n\n How it could be resolved:\n - Fix data format (e.g. change file extension from `txt` to `csv`).\n - Set correct format in {validator}."
},
"encoding-error": {
"name": "Encoding Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source could not be successfully decoded with {encoding} encoding",
"description": "Data reading error because of an encoding problem.\n\n How it could be resolved:\n - Fix data source if it's broken.\n - Set correct encoding in {validator}."
},
"blank-header": {
"name": "Blank Header",
"type": "structure",
"context": "head",
"weight": 3,
"message": "Header in column {column_number} is blank",
"description": "A column in the header row is missing a value. Column names should be provided.\n\n How it could be resolved:\n - Add the missing column name to the first row of the data source.\n - If the first row starts with, or ends with a comma, remove it.\n - If this error should be ignored disable `blank-header` check in {validator}."
},
"duplicate-header": {
"name": "Duplicate Header",
"type": "structure",
"context": "head",
"weight": 3,
"message": "Header in column {column_number} is duplicated to header in column(s) {column_numbers}",
"description": "Two columns in the header row have the same value. Column names should be unique.\n\n How it could be resolved:\n - Add the missing column name to the first row of the data.\n - If the first row starts with, or ends with a comma, remove it.\n - If this error should be ignored disable `duplicate-header` check in {validator}."
},
"blank-row": {
"name": "Blank Row",
"type": "structure",
"context": "body",
"weight": 9,
"message": "Row {row_number} is completely blank",
"description": "This row is empty. A row should contain at least one value.\n\n How it could be resolved:\n - Delete the row.\n - If this error should be ignored disable `blank-row` check in {validator}."
},
"duplicate-row": {
"name": "Duplicate Row",
"type": "structure",
"context": "body",
"weight": 5,
"message": "Row {row_number} is duplicated to row(s) {row_numbers}",
"description": "The exact same data has been seen in another row.\n\n How it could be resolved:\n - If some of the data is incorrect, correct it.\n - If the whole row is an incorrect duplicate, remove it.\n - If this error should be ignored disable `duplicate-row` check in {validator}."
},
"extra-value": {
"name": "Extra Value",
"type": "structure",
"context": "body",
"weight": 9,
"message": "Row {row_number} has an extra value in column {column_number}",
"description": "This row has more values than the header row (the first row in the data source). A key concept is that all the rows in tabular data must have the same number of columns.\n\n How it could be resolved:\n - Check whether the data has an extra comma between the values in this row.\n - If this error should be ignored disable `extra-value` check in {validator}."
},
"missing-value": {
"name": "Missing Value",
"type": "structure",
"context": "body",
"weight": 9,
"message": "Row {row_number} has a missing value in column {column_number}",
"description": "This row has fewer values than the header row (the first row in the data source). A key concept is that all the rows in tabular data must have the same number of columns.\n\n How it could be resolved:\n - Check that the data is not missing a comma between the values in this row.\n - If this error should be ignored disable `missing-value` check in {validator}."
},
"schema-error": {
"name": "Table Schema Error",
"type": "schema",
"context": "table",
"weight": 15,
"message": "Table Schema error: {error_message}",
"description": "Provided schema is not valid.\n\n How it could be resolved:\n - Update schema descriptor to be a valid descriptor\n - If this error should be ignored disable schema checks in {validator}."
},
"non-matching-header": {
"name": "Non-Matching Header",
"type": "schema",
"context": "head",
"weight": 9,
"message": "Header in column {column_number} doesn't match field name {field_name} in the schema",
"description": "One of the data source headers doesn't match the field name defined in the schema.\n\n How it could be resolved:\n - Rename header in the data source or field in the schema\n - If this error should be ignored disable `non-matching-header` check in {validator}."
},
"extra-header": {
"name": "Extra Header",
"type": "schema",
"context": "head",
"weight": 9,
"message": "There is an extra header in column {column_number}",
"description": "The first row of the data source contains header that doesn't exist in the schema.\n\n How it could be resolved:\n - Remove the extra column from the data source or add the missing field to the schema\n - If this error should be ignored disable `extra-header` check in {validator}."
},
"missing-header": {
"name": "Missing Header",
"type": "schema",
"context": "head",
"weight": 9,
"message": "There is a missing header in column {column_number}",
"description": "Based on the schema there should be a header that is missing in the first row of the data source.\n\n How it could be resolved:\n - Add the missing column to the data source or remove the extra field from the schema\n - If this error should be ignored disable `missing-header` check in {validator}."
},
"type-or-format-error": {
"name": "Type or Format Error",
"type": "schema",
"context": "body",
"weight": 9,
"message": "The value {value} in row {row_number} and column {column_number} is not type {field_type} and format {field_format}",
"description": "The value does not match the schema type and format for this field.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If this value is correct, adjust the type and/or format.\n - To ignore the error, disable the `type-or-format-error` check in {validator}. In this case all schema checks for row values will be ignored."
},
"required-constraint": {
"name": "Required Constraint",
"type": "schema",
"context": "body",
"weight": 9,
"message": "Column {column_number} is a required field, but row {row_number} has no value",
"description": "This field is a required field, but it contains no value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove the `required` constraint from the schema.\n - If this error should be ignored disable `required-constraint` check in {validator}."
},
"pattern-constraint": {
"name": "Pattern Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the pattern constraint of {constraint}",
"description": "This field value should conform to constraint pattern.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `pattern` constraint in the schema.\n - If this error should be ignored disable `pattern-constraint` check in {validator}."
},
"unique-constraint": {
"name": "Unique Constraint",
"type": "schema",
"context": "body",
"weight": 9,
"message": "Rows {row_numbers} have a unique constraint violation in column {column_number}",
"description": "This field is a unique field but it contains a value that has been used in another row.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then the values in this column are not unique. Remove the `unique` constraint from the schema.\n - If this error should be ignored disable `unique-constraint` check in {validator}."
},
"enumerable-constraint": {
"name": "Enumerable Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the given enumeration: {constraint}",
"description": "This field value should be equal to one of the values in the enumeration constraint.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `enum` constraint in the schema.\n - If this error should be ignored disable `enumerable-constraint` check in {validator}."
},
"minimum-constraint": {
"name": "Minimum Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the minimum constraint of {constraint}",
"description": "This field value should be greater than or equal to the constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `minimum` constraint in the schema.\n - If this error should be ignored disable `minimum-constraint` check in {validator}."
},
"maximum-constraint": {
"name": "Maximum Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the maximum constraint of {constraint}",
"description": "This field value should be less than or equal to the constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `maximum` constraint in the schema.\n - If this error should be ignored disable `maximum-constraint` check in {validator}."
},
"minimum-length-constraint": {
"name": "Minimum Length Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the minimum length constraint of {constraint}",
"description": "The length of this field value should be greater than or equal to the schema constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `minimumLength` constraint in the schema.\n - If this error should be ignored disable `minimum-length-constraint` check in {validator}."
},
"maximum-length-constraint": {
"name": "Maximum Length Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the maximum length constraint of {constraint}",
"description": "The length of this field value should be less than or equal to the schema constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `maximumLength` constraint in the schema.\n - If this error should be ignored disable `maximum-length-constraint` check in {validator}."
}
}
}
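The messages above use a simple `{placeholder}` convention (`{row_number}`, `{column_number}`, and so on). As a rough illustration of how a consumer might fill them in, here is a minimal Java sketch; the `render` helper is hypothetical and not part of the spec or of OpenRefine:

```java
import java.util.HashMap;
import java.util.Map;

public class MessageTemplateDemo {
    // Hypothetical helper: substitutes {name} tokens in a message template
    // with concrete values taken from the supplied map.
    static String render(String template, Map<String, String> vars) {
        String result = template;
        for (Map.Entry<String, String> e : vars.entrySet()) {
            result = result.replace("{" + e.getKey() + "}", e.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, String> vars = new HashMap<>();
        vars.put("row_number", "7");
        vars.put("column_number", "3");
        String template = "Row {row_number} has a missing value in column {column_number}";
        // Prints: Row 7 has a missing value in column 3
        System.out.println(render(template, vars));
    }
}
```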

View File

@@ -0,0 +1,16 @@
{
"image": "",
"license": "",
"last_updated": "",
"keywords": [],
"sources": [{
"web": "",
"name": "",
"title": ""
}],
"name": "",
"description": "",
"resources": [],
"title": "",
"version": ""
}

View File

@@ -37,6 +37,7 @@ import java.io.IOException;
import java.io.InputStream;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.temporal.ChronoUnit;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
@@ -45,7 +46,7 @@ import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import org.apache.commons.lang.exception.ExceptionUtils;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.apache.tools.tar.TarOutputStream;
import org.json.JSONArray;
import org.json.JSONException;
@@ -55,6 +56,8 @@ import org.slf4j.LoggerFactory;
import com.google.refine.history.HistoryEntryManager;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.preference.PreferenceStore;
import com.google.refine.preference.TopList;
@@ -73,7 +76,6 @@ public abstract class ProjectManager {
// Don't spend more than this much time saving projects if doing a quick save
static protected final int QUICK_SAVE_MAX_TIME = 1000 * 30; // 30 secs
protected Map<Long, ProjectMetadata> _projectsMetadata;
protected Map<String, Integer> _projectsTags;// TagName, number of projects having that tag
protected PreferenceStore _preferenceStore;
@@ -100,7 +102,7 @@ public abstract class ProjectManager {
static public ProjectManager singleton;
protected ProjectManager(){
protected ProjectManager() {
_projectsMetadata = new HashMap<Long, ProjectMetadata>();
_preferenceStore = new PreferenceStore();
_projects = new HashMap<Long, Project>();
@@ -191,7 +193,7 @@ public abstract class ProjectManager {
} catch (Exception e) {
e.printStackTrace();
}
}//FIXME what should be the behaviour if metadata is null? i.e. not found
}
Project project = getProject(id);
if (project != null && metadata != null && metadata.getModified().isAfter(project.getLastSave())) {
@@ -200,8 +202,7 @@ public abstract class ProjectManager {
} catch (Exception e) {
e.printStackTrace();
}
}//FIXME what should be the behaviour if project is null? i.e. not found or loaded.
//FIXME what should happen if the metadata is found, but not the project? or vice versa?
}
}
}
@@ -212,7 +213,7 @@ public abstract class ProjectManager {
* @param projectId
* @throws Exception
*/
public abstract void saveMetadata(ProjectMetadata metadata, long projectId) throws Exception;
public abstract void saveMetadata(IMetadata metadata, long projectId) throws Exception;
/**
* Save project to the data store
@@ -265,19 +266,19 @@ public abstract class ProjectManager {
Project project = _projects.get(id); // don't call getProject() as that will load the project.
if (project != null) {
LocalDateTime projectLastSaveTime = project.getLastSave();
boolean hasUnsavedChanges =
metadata.getModified().atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() >= project.getLastSave().atZone(ZoneId.systemDefault()).toInstant().toEpochMilli();
!metadata.getModified().isBefore(projectLastSaveTime);
// We use >= instead of just > to avoid the case where a newly created project
// has the same modified and last save times, resulting in the project not getting
// saved at all.
if (hasUnsavedChanges) {
long msecsOverdue = startTimeOfSave.atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() - project.getLastSave().atZone(ZoneId.systemDefault()).toInstant().toEpochMilli();
long msecsOverdue = ChronoUnit.MILLIS.between(projectLastSaveTime, startTimeOfSave);
records.add(new SaveRecord(project, msecsOverdue));
} else if (!project.getProcessManager().hasPending()
&& startTimeOfSave.atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() - project.getLastSave().atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() > PROJECT_FLUSH_DELAY) {
&& ChronoUnit.MILLIS.between(projectLastSaveTime, startTimeOfSave) > PROJECT_FLUSH_DELAY) {
/*
* It's been a while since the project was last saved and it hasn't been
@@ -308,12 +309,9 @@ public abstract class ProjectManager {
"Saving some modified projects ..."
);
for (int i = 0;
i < records.size() &&
(allModified || (LocalDateTime.now().atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() -
startTimeOfSave.atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() < QUICK_SAVE_MAX_TIME));
for (int i = 0; i < records.size() &&
(allModified || (ChronoUnit.MILLIS.between(startTimeOfSave, LocalDateTime.now()) < QUICK_SAVE_MAX_TIME));
i++) {
try {
saveProject(records.get(i).project);
} catch (Exception e) {
@@ -351,7 +349,7 @@ public abstract class ProjectManager {
/**
* Gets the project metadata from memory
* Requires that the metadata has already been loaded from the data store
* Requires that the metadata has already been loaded from the data store.
* @param id
* @return
*/
@@ -420,7 +418,7 @@ public abstract class ProjectManager {
userMetadataPreference = new JSONArray(userMeta);
} catch (JSONException e1) {
logger.warn("wrong definition of userMetadata format. Please use form [{\"name\": \"client name\", \"display\":true}, {\"name\": \"progress\", \"display\":false}]");
logger.error(ExceptionUtils.getFullStackTrace(e1));
logger.error(ExceptionUtils.getStackTrace(e1));
}
for (int index = 0; index < userMetadataPreference.length(); index++) {
@@ -465,7 +463,7 @@ public abstract class ProjectManager {
JSONObject projectMetaJsonObj = jsonObjArray.getJSONObject(index);
projectMetaJsonObj.put("display", false);
} catch (JSONException e) {
logger.error(ExceptionUtils.getFullStackTrace(e));
logger.error(ExceptionUtils.getStackTrace(e));
}
}
}
@@ -474,6 +472,7 @@ public abstract class ProjectManager {
* Gets all the project Metadata currently held in memory.
* @return
*/
public Map<Long, ProjectMetadata> getAllProjectMetadata() {
for(Project project : _projects.values()) {
mergeEmptyUserMetadata(project.getMetadata());
@@ -491,6 +490,7 @@ public abstract class ProjectManager {
return _projectsTags;
}
/**
* Gets the required project from the data store
* If project does not already exist in memory, it is loaded from the data store
@@ -596,8 +596,9 @@ public abstract class ProjectManager {
*
* @param ps
*/
static protected void preparePreferenceStore(PreferenceStore ps) {
public static void preparePreferenceStore(PreferenceStore ps) {
ps.put("scripting.expressions", new TopList(s_expressionHistoryMax));
ps.put("scripting.starred-expressions", new TopList(Integer.MAX_VALUE));
}
}
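The timestamp changes in this file replace repeated `atZone(...).toInstant().toEpochMilli()` conversions with a single `ChronoUnit.MILLIS.between` call. A small stand-alone sketch of the equivalence (class name and sample times are made up for illustration):

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.temporal.ChronoUnit;

public class ElapsedMillisDemo {
    public static void main(String[] args) {
        LocalDateTime lastSave = LocalDateTime.of(2017, 12, 1, 10, 0, 0);
        LocalDateTime now = lastSave.plusSeconds(90);

        // Before the patch: convert both LocalDateTimes to epoch millis, then subtract.
        long viaEpoch = now.atZone(ZoneId.systemDefault()).toInstant().toEpochMilli()
                - lastSave.atZone(ZoneId.systemDefault()).toInstant().toEpochMilli();

        // After the patch: one ChronoUnit call expresses the same elapsed time.
        // (Note: between() on LocalDateTime ignores zone transitions, so the two
        // forms can differ across a DST change in the system zone.)
        long viaChrono = ChronoUnit.MILLIS.between(lastSave, now);

        System.out.println(viaEpoch);   // 90000
        System.out.println(viaChrono);  // 90000
    }
}
```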

View File

@@ -37,7 +37,7 @@ import java.util.Iterator;
import java.util.TreeSet;
import java.util.regex.Pattern;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.StringUtils;
public class FingerprintKeyer extends Keyer {

View File

@@ -52,11 +52,11 @@ import org.slf4j.LoggerFactory;
import com.google.refine.Jsonizable;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.RefineServlet;
import com.google.refine.browsing.Engine;
import com.google.refine.history.HistoryEntry;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.process.Process;
import com.google.refine.util.ParsingUtilities;
@@ -194,7 +194,7 @@ public abstract class Command {
* @return
* @throws ServletException
*/
protected ProjectMetadata getProjectMetadata(HttpServletRequest request) throws ServletException {
protected ProjectMetadata getMetadata(HttpServletRequest request) throws ServletException {
if (request == null) {
throw new IllegalArgumentException("parameter 'request' should not be null");
}
@@ -313,6 +313,19 @@ public abstract class Command {
w.close();
}
static protected void respondJSONObject(
HttpServletResponse response, JSONObject o)
throws IOException, JSONException {
response.setCharacterEncoding("UTF-8");
response.setHeader("Content-Type", "application/json");
response.setHeader("Cache-Control", "no-cache");
Writer w = response.getWriter();
w.append(o.toString());
w.flush();
w.close();
}
static protected void respondException(HttpServletResponse response, Exception e)
throws IOException, ServletException {

View File

@@ -54,9 +54,7 @@ public class GetPreferenceCommand extends Command {
throws ServletException, IOException {
Project project = request.getParameter("project") != null ? getProject(request) : null;
PreferenceStore ps = project != null ?
project.getMetadata().getPreferenceStore() :
ProjectManager.singleton.getPreferenceStore();
PreferenceStore ps = ProjectManager.singleton.getPreferenceStore();
String prefName = request.getParameter("name");
Object pref = ps.get(prefName);

View File

@@ -52,9 +52,7 @@ public class SetPreferenceCommand extends Command {
throws ServletException, IOException {
Project project = request.getParameter("project") != null ? getProject(request) : null;
PreferenceStore ps = project != null ?
project.getMetadata().getPreferenceStore() :
ProjectManager.singleton.getPreferenceStore();
PreferenceStore ps = ProjectManager.singleton.getPreferenceStore();
String prefName = request.getParameter("name");
String valueString = request.getParameter("value");

View File

@@ -63,7 +63,7 @@ public class GetExpressionHistoryCommand extends Command {
try {
Project project = getProject(request);
List<String> localExpressions = toExpressionList(project.getMetadata().getPreferenceStore().get("scripting.expressions"));
List<String> localExpressions = toExpressionList(ProjectManager.singleton.getPreferenceStore().get("scripting.expressions"));
localExpressions = localExpressions.size() > 20 ? localExpressions.subList(0, 20) : localExpressions;
List<String> globalExpressions = toExpressionList(ProjectManager.singleton.getPreferenceStore().get("scripting.expressions"));

View File

@@ -54,7 +54,7 @@ public class LogExpressionCommand extends Command {
Project project = getProject(request);
String expression = request.getParameter("expression");
((TopList) project.getMetadata().getPreferenceStore().get("scripting.expressions"))
((TopList) ProjectManager.singleton.getPreferenceStore().get("scripting.expressions"))
.add(expression);
((TopList) ProjectManager.singleton.getPreferenceStore().get("scripting.expressions"))

View File

@@ -41,8 +41,8 @@ import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command;
import com.google.refine.model.medadata.ProjectMetadata;
public class DeleteProjectCommand extends Command {

View File

@@ -0,0 +1,48 @@
package com.google.refine.commands.project;
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.everit.json.schema.ValidationException;
import org.json.JSONException;
import com.google.refine.commands.Command;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.MetadataFactory;
import com.google.refine.model.medadata.MetadataFormat;
public class GetMetadataCommand extends Command {
@Override
public void doGet(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
try {
Project project;
MetadataFormat metadataFormat;
try {
project = getProject(request);
metadataFormat = MetadataFormat.valueOf(request.getParameter("metadataFormat"));
} catch (ServletException e) {
respond(response, "error", e.getLocalizedMessage());
return;
}
// for now, only the data package metadata is supported.
if (metadataFormat != MetadataFormat.DATAPACKAGE_METADATA) {
respond(response, "error", "metadata format is not supported");
return;
}
IMetadata metadata = MetadataFactory.buildDataPackageMetadata(project);
respondJSONObject(response, metadata.getJSON());
} catch (JSONException e) {
respondException(response, e);
} catch (ValidationException e) {
respondException(response, e);
}
}
}
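One subtlety in the command above: `MetadataFormat.valueOf` throws an unchecked `IllegalArgumentException` (not a `ServletException`) when the `metadataFormat` parameter names no enum constant, so that failure escapes both catch blocks. A defensive parse could look like this (the enum constants here mirror the ones referenced in the diff and are otherwise an assumption):

```java
import java.util.Optional;

class MetadataFormatParser {
    // Enum constants mirroring the formats referenced in this PR (assumed set).
    enum MetadataFormat { PROJECT_METADATA, DATAPACKAGE_METADATA }

    // Enum.valueOf throws an unchecked IllegalArgumentException on unknown
    // names; wrap it so a bad request parameter yields an empty Optional
    // instead of an uncaught exception.
    static Optional<MetadataFormat> parse(String raw) {
        if (raw == null) {
            return Optional.empty();
        }
        try {
            return Optional.of(MetadataFormat.valueOf(raw));
        } catch (IllegalArgumentException e) {
            return Optional.empty();
        }
    }
}
```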

View File

@ -51,9 +51,9 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.ParsingUtilities;
public class ImportProjectCommand extends Command {

View File

@ -0,0 +1,83 @@
package com.google.refine.commands.project;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.util.zip.GZIPOutputStream;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.commons.io.IOUtils;
import org.apache.tools.tar.TarOutputStream;
import com.google.refine.ProjectManager;
import com.google.refine.browsing.Engine;
import com.google.refine.commands.Command;
import com.google.refine.exporters.CsvExporter;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.DataPackageMetadata;
import com.google.refine.model.medadata.PackageExtension;
public class PackageProjectCommand extends Command {
@Override
public void doPost(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
ProjectManager.singleton.setBusy(true);
try {
// get the metadata
String metadata = request.getParameter("metadata");
InputStream in = IOUtils.toInputStream(metadata, "UTF-8");
Project project = getProject(request);
Engine engine = getEngine(request, project);
// ensure the project gets saved
DataPackageMetadata dpm = new DataPackageMetadata();
dpm.loadFromStream(in);
ProjectManager.singleton.ensureProjectSaved(project.id);
// export project
CsvExporter exporter = new CsvExporter();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
Writer outputStreamWriter = new OutputStreamWriter(baos);
exporter.export(project, null, engine, outputStreamWriter);
OutputStream os = response.getOutputStream();
try {
PackageExtension.saveZip(dpm.getPackage(), baos, os);
response.setHeader("Content-Type", "application/x-gzip");
} finally {
outputStreamWriter.close();
os.close();
}
} catch (Exception e) {
respondException(response, e);
} finally {
ProjectManager.singleton.setBusy(false);
}
}
protected void gzipTarToOutputStream(Project project, OutputStream os) throws IOException {
GZIPOutputStream gos = new GZIPOutputStream(os);
try {
tarToOutputStream(project, gos);
} finally {
gos.close();
}
}
protected void tarToOutputStream(Project project, OutputStream os) throws IOException {
TarOutputStream tos = new TarOutputStream(os);
try {
ProjectManager.singleton.exportProject(project.id, tos);
} finally {
tos.close();
}
}
}
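`PackageExtension.saveZip` presumably bundles the data package descriptor together with the exported CSV into a single archive written to the response stream. A stdlib sketch of that packaging step (the entry names `datapackage.json` and `data.csv` are assumptions):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

class DataPackageZipper {
    // Write a two-entry zip: the descriptor plus the exported CSV data.
    // Entry names are assumptions about what saveZip produces.
    static void saveZip(String descriptorJson, ByteArrayOutputStream csvData,
            OutputStream os) throws IOException {
        try (ZipOutputStream zos = new ZipOutputStream(os)) {
            zos.putNextEntry(new ZipEntry("datapackage.json"));
            zos.write(descriptorJson.getBytes(StandardCharsets.UTF_8));
            zos.closeEntry();

            zos.putNextEntry(new ZipEntry("data.csv"));
            csvData.writeTo(zos);
            zos.closeEntry();
        }
    }
}
```

Note that the command sets the `Content-Type` header after writing to the output stream has begun, which may be too late for the header to take effect.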

View File

@ -39,8 +39,8 @@ import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command;
import com.google.refine.model.medadata.ProjectMetadata;
public class RenameProjectCommand extends Command {
@Override
@ -49,7 +49,7 @@ public class RenameProjectCommand extends Command {
try {
String name = request.getParameter("name");
ProjectMetadata pm = getProjectMetadata(request);
ProjectMetadata pm = getMetadata(request);
pm.setName(name);

View File

@ -9,15 +9,14 @@ import javax.servlet.http.HttpServletResponse;
import org.json.JSONException;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
public class SetProjectMetadataCommand extends Command {
@Override
public void doPost(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
Project project = request.getParameter("project") != null ? getProject(request) : null;
String metaName = request.getParameter("name");
String valueString = request.getParameter("value");
@ -33,7 +32,7 @@ public class SetProjectMetadataCommand extends Command {
response.setCharacterEncoding("UTF-8");
response.setHeader("Content-Type", "application/json");
meta.setAnyField(metaName, valueString);
meta.setAnyStringField(metaName, valueString);
ProjectManager.singleton.saveMetadata(meta, project.id);
respond(response, "{ \"code\" : \"ok\" }");
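The hunk renames `setAnyField` to `setAnyStringField`, making explicit that the value arrives as a raw request-parameter string. A minimal sketch of such a name-dispatched string setter (the field set shown here is hypothetical, not the PR's metadata model):

```java
class MetadataSetter {
    String name;
    String description;

    // Dispatch a string-typed value onto a named field, as the rename to
    // setAnyStringField suggests; the handled field names are hypothetical.
    void setAnyStringField(String metaName, String valueString) {
        switch (metaName) {
            case "name":
                name = valueString;
                break;
            case "description":
                description = valueString;
                break;
            default:
                throw new IllegalArgumentException("unknown field: " + metaName);
        }
    }
}
```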

View File

@ -37,9 +37,9 @@ import javax.servlet.http.HttpServletResponse;
import org.json.JSONException;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
public class SetProjectTagsCommand extends Command {
@Override

View File

@ -0,0 +1,42 @@
package com.google.refine.commands.project;
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.json.JSONException;
import org.json.JSONObject;
import com.google.refine.ProjectManager;
import com.google.refine.commands.Command;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.validator.ValidateOperation;
import com.google.refine.util.ParsingUtilities;
public class ValidateSchemaCommand extends Command {
@Override
public void doPost(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
ProjectManager.singleton.setBusy(true);
try {
Project project = getProject(request);
JSONObject optionObj = ParsingUtilities.evaluateJsonStringToObject(
request.getParameter("options"));
new ValidateOperation(project, optionObj).startProcess();
respond(response, "{ \"code\" : \"ok\" }");
} catch (JSONException e) {
respondException(response, e);
} catch (ServletException e) {
respond(response, "error", e.getLocalizedMessage());
return;
} finally {
ProjectManager.singleton.setBusy(false);
}
}
}

View File

@ -47,8 +47,8 @@ import org.json.JSONException;
import org.json.JSONWriter;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command;
import com.google.refine.model.medadata.ProjectMetadata;
public class GetAllProjectMetadataCommand extends Command {
@Override

View File

@ -44,7 +44,7 @@ import java.util.Map;
import java.util.Properties;
import java.util.TimeZone;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.StringUtils;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;

View File

@ -38,7 +38,7 @@ import java.io.Writer;
import java.util.List;
import java.util.Properties;
import org.apache.commons.lang.StringEscapeUtils;
import org.apache.commons.lang3.StringEscapeUtils;
import org.json.JSONObject;
import com.google.refine.ProjectManager;
@ -103,7 +103,7 @@ public class HtmlTableExporter implements WriterExporter {
if (cellData.link != null) {
writer.write("<a href=\"");
// TODO: The escape below looks wrong, but is probably harmless in most cases
writer.write(StringEscapeUtils.escapeHtml(cellData.link));
writer.write(StringEscapeUtils.escapeHtml4(cellData.link));
writer.write("\">");
}
writer.write(StringEscapeUtils.escapeXml(cellData.text));

View File

@ -42,7 +42,7 @@ import java.util.GregorianCalendar;
import java.util.Locale;
import java.util.Properties;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.StringUtils;
import org.json.JSONException;
import org.json.JSONWriter;

View File

@ -35,7 +35,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.StringUtils;
import org.json.JSONException;
import org.json.JSONWriter;

View File

@ -37,7 +37,7 @@ import java.util.Calendar;
import java.util.Date;
import java.util.Properties;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.StringUtils;
import org.json.JSONException;
import org.json.JSONWriter;

View File

@ -37,7 +37,7 @@ import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.util.Properties;
import org.apache.commons.lang.StringEscapeUtils;
import org.apache.commons.lang3.StringEscapeUtils;
import org.json.JSONException;
import org.json.JSONWriter;
@ -65,13 +65,13 @@ public class Escape implements Function {
if (o2 instanceof String) {
String mode = ((String) o2).toLowerCase();
if ("html".equals(mode)) {
return StringEscapeUtils.escapeHtml(s);
return StringEscapeUtils.escapeHtml4(s);
} else if ("xml".equals(mode)) {
return StringEscapeUtils.escapeXml(s);
return StringEscapeUtils.escapeXml11(s);
} else if ("csv".equals(mode)) {
return StringEscapeUtils.escapeCsv(s);
} else if ("javascript".equals(mode)) {
return StringEscapeUtils.escapeJavaScript(s);
return StringEscapeUtils.escapeEcmaScript(s);
} else if ("url".equals(mode)) {
try {
return URLEncoder.encode(s,"UTF-8");
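The hunks above are part of this PR's migration from commons-lang 2 to commons-lang3, which renamed several `StringEscapeUtils` methods. Summarizing the mapping drawn from the hunks (note that in lang3 the counterpart of `unescapeJavaScript` is `unescapeEcmaScript`):

```java
import java.util.Map;

class Lang3Renames {
    // StringEscapeUtils method renames applied when moving from
    // org.apache.commons.lang to org.apache.commons.lang3.
    static final Map<String, String> RENAMES = Map.of(
            "escapeHtml", "escapeHtml4",
            "unescapeHtml", "unescapeHtml4",
            "escapeXml", "escapeXml11",
            "escapeJavaScript", "escapeEcmaScript",
            "unescapeJavaScript", "unescapeEcmaScript");
}
```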

View File

@ -35,7 +35,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.StringUtils;
import org.json.JSONException;
import org.json.JSONWriter;

View File

@ -40,11 +40,11 @@ import org.json.JSONException;
import org.json.JSONWriter;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.expr.EvalError;
import com.google.refine.grel.ControlFunctionRegistry;
import com.google.refine.grel.Function;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
public class Reinterpret implements Function {

View File

@ -35,7 +35,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.StringUtils;
import org.json.JSONException;
import org.json.JSONWriter;

View File

@ -36,7 +36,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties;
import java.util.regex.Pattern;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.StringUtils;
import org.json.JSONException;
import org.json.JSONWriter;

View File

@ -35,7 +35,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.StringUtils;
import org.json.JSONException;
import org.json.JSONWriter;

View File

@ -35,7 +35,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties;
import org.apache.commons.lang.WordUtils;
import org.apache.commons.lang3.text.WordUtils;
import org.json.JSONException;
import org.json.JSONWriter;

View File

@ -37,7 +37,7 @@ import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;
import java.util.Properties;
import org.apache.commons.lang.StringEscapeUtils;
import org.apache.commons.lang3.StringEscapeUtils;
import org.json.JSONException;
import org.json.JSONWriter;
@ -56,13 +56,13 @@ public class Unescape implements Function {
String s = (String) o1;
String mode = ((String) o2).toLowerCase();
if ("html".equals(mode)) {
return StringEscapeUtils.unescapeHtml(s);
return StringEscapeUtils.unescapeHtml4(s);
} else if ("xml".equals(mode)) {
return StringEscapeUtils.unescapeXml(s);
} else if ("csv".equals(mode)) {
return StringEscapeUtils.unescapeCsv(s);
} else if ("javascript".equals(mode)) {
return StringEscapeUtils.unescapeJavaScript(s);
return StringEscapeUtils.unescapeEcmaScript(s);
} else if ("url".equals(mode)) {
try {
return URLDecoder.decode(s,"UTF-8");

View File

@ -33,7 +33,7 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
package com.google.refine.grel.controls;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.StringUtils;
public class IsNumeric extends IsTest {
@Override

View File

@ -44,7 +44,7 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.commons.lang.exception.ExceptionUtils;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.apache.poi.POIXMLDocument;
import org.apache.poi.POIXMLException;
import org.apache.poi.common.usermodel.Hyperlink;
@ -60,13 +60,13 @@ import org.json.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Cell;
import com.google.refine.model.Project;
import com.google.refine.model.Recon;
import com.google.refine.model.Recon.Judgment;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.model.ReconCandidate;
import com.google.refine.util.JSONUtilities;
@ -191,7 +191,7 @@ public class ExcelImporter extends TabularImportingParserBase {
// value is fileName#sheetIndex
fileNameAndSheetIndex = sheetObj.getString("fileNameAndSheetIndex").split("#");
} catch (JSONException e) {
logger.error(ExceptionUtils.getFullStackTrace(e));
logger.error(ExceptionUtils.getStackTrace(e));
}
if (!fileNameAndSheetIndex[0].equals(fileSource))
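The importer encodes the sheet selection as `fileName#sheetIndex` and splits on `#`. A file name that itself contains `#` would break a plain `split("#")`, so splitting on the last separator is more robust; a sketch of that parsing (the `SheetRef` helper is hypothetical, not from the PR):

```java
class SheetRef {
    final String fileName;
    final int sheetIndex;

    private SheetRef(String fileName, int sheetIndex) {
        this.fileName = fileName;
        this.sheetIndex = sheetIndex;
    }

    // Split "fileName#sheetIndex" on the LAST '#', so file names that
    // themselves contain '#' still parse correctly.
    static SheetRef parse(String encoded) {
        int sep = encoded.lastIndexOf('#');
        if (sep < 0) {
            throw new IllegalArgumentException("missing '#': " + encoded);
        }
        return new SheetRef(encoded.substring(0, sep),
                Integer.parseInt(encoded.substring(sep + 1)));
    }
}
```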

View File

@ -14,10 +14,10 @@ import java.util.List;
import org.json.JSONArray;
import org.json.JSONObject;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
public class FixedWidthImporter extends TabularImportingParserBase {

View File

@ -44,7 +44,6 @@ import org.json.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importers.ImporterUtilities.MultiFileReadingProgress;
import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingParser;
@ -52,6 +51,7 @@ import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Column;
import com.google.refine.model.ModelException;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
abstract public class ImportingParserBase implements ImportingParser {

View File

@ -49,7 +49,6 @@ import org.json.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importers.tree.ImportColumnGroup;
import com.google.refine.importers.tree.TreeImportingParserBase;
import com.google.refine.importers.tree.TreeReader;
@ -57,6 +56,7 @@ import com.google.refine.importers.tree.TreeReaderException;
import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
public class JsonImporter extends TreeImportingParserBase {

View File

@ -10,9 +10,9 @@ import org.json.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
public class LineBasedImporter extends TabularImportingParserBase {

View File

@ -44,7 +44,7 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.commons.lang.exception.ExceptionUtils;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
@ -55,13 +55,13 @@ import org.odftoolkit.odfdom.doc.table.OdfTableRow;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Cell;
import com.google.refine.model.Project;
import com.google.refine.model.Recon;
import com.google.refine.model.Recon.Judgment;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.model.ReconCandidate;
import com.google.refine.util.JSONUtilities;
@ -150,7 +150,7 @@ public class OdsImporter extends TabularImportingParserBase {
// value is fileName#sheetIndex
fileNameAndSheetIndex = sheetObj.getString("fileNameAndSheetIndex").split("#");
} catch (JSONException e) {
logger.error(ExceptionUtils.getFullStackTrace(e));
logger.error(ExceptionUtils.getStackTrace(e));
}
if (!fileNameAndSheetIndex[0].equals(fileSource))

View File

@ -50,7 +50,6 @@ import org.jrdf.parser.RdfReader;
import org.jrdf.util.ClosableIterable;
import org.json.JSONObject;
import com.google.refine.ProjectMetadata;
import com.google.refine.expr.ExpressionUtils;
import com.google.refine.importing.ImportingJob;
import com.google.refine.model.Cell;
@ -58,6 +57,7 @@ import com.google.refine.model.Column;
import com.google.refine.model.ModelException;
import com.google.refine.model.Project;
import com.google.refine.model.Row;
import com.google.refine.model.medadata.ProjectMetadata;
public class RdfTripleImporter extends ImportingParserBase {
private RdfReader rdfReader;

View File

@ -49,15 +49,15 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.commons.lang.StringEscapeUtils;
import org.apache.commons.lang3.StringEscapeUtils;
import org.json.JSONObject;
import au.com.bytecode.opencsv.CSVParser;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
public class SeparatorBasedImporter extends TabularImportingParserBase {

View File

@ -41,13 +41,13 @@ import java.util.List;
import org.json.JSONObject;
import com.google.refine.ProjectMetadata;
import com.google.refine.expr.ExpressionUtils;
import com.google.refine.importing.ImportingJob;
import com.google.refine.model.Cell;
import com.google.refine.model.Column;
import com.google.refine.model.Project;
import com.google.refine.model.Row;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
abstract public class TabularImportingParserBase extends ImportingParserBase {

View File

@ -56,13 +56,13 @@ import org.sweble.wikitext.parser.preprocessor.PreprocessedWikitext;
import xtc.parser.ParseException;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob;
import com.google.refine.model.Cell;
import com.google.refine.model.Column;
import com.google.refine.model.Project;
import com.google.refine.model.Recon;
import com.google.refine.model.ReconStats;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.model.recon.StandardReconConfig.ColumnDetail;
import com.google.refine.util.JSONUtilities;
import com.google.refine.model.recon.StandardReconConfig;

View File

@ -51,7 +51,6 @@ import org.json.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importers.tree.ImportColumnGroup;
import com.google.refine.importers.tree.TreeImportingParserBase;
import com.google.refine.importers.tree.TreeReader;
@ -59,6 +58,7 @@ import com.google.refine.importers.tree.TreeReaderException;
import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
public class XmlImporter extends TreeImportingParserBase {

View File

@ -3,7 +3,7 @@ package com.google.refine.importers.tree;
import java.util.LinkedHashMap;
import java.util.Map;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.StringUtils;
/**
* A column group describes a branch in tree structured data

View File

@ -39,16 +39,16 @@ import java.io.InputStream;
import java.io.Reader;
import java.util.List;
import org.apache.commons.lang.NotImplementedException;
import org.apache.commons.lang3.NotImplementedException;
import org.json.JSONObject;
import com.google.refine.ProjectMetadata;
import com.google.refine.importers.ImporterUtilities;
import com.google.refine.importers.ImporterUtilities.MultiFileReadingProgress;
import com.google.refine.importers.ImportingParserBase;
import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
/**
@ -154,7 +154,7 @@ abstract public class TreeImportingParserBase extends ImportingParserBase {
JSONObject options,
List<Exception> exceptions
) {
throw new NotImplementedException();
throw new NotImplementedException("project ID:" + project.id);
}
/**

View File

@ -272,6 +272,14 @@ public class DefaultImportingController implements ImportingController {
}
}
/**
* Return the job data to the front end.
* @param request
* @param response
* @param job
* @throws ServletException
* @throws IOException
*/
private void replyWithJobData(HttpServletRequest request, HttpServletResponse response, ImportingJob job)
throws ServletException, IOException {

View File

@ -47,8 +47,8 @@ import org.json.JSONWriter;
import com.google.refine.Jsonizable;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities;
@ -139,6 +139,14 @@ public class ImportingJob implements Jsonizable {
}
}
/**
* Check if the file record is a metadata file entry.
* @param fileRecordObject
* @return true if the record has a "metaDataFormat" field
*/
public boolean isMetadataFileRecord(JSONObject fileRecordObject) {
return fileRecordObject.has("metaDataFormat");
}
public List<JSONObject> getSelectedFileRecords() {
List<JSONObject> results = new ArrayList<JSONObject>();
@ -208,5 +216,4 @@ public class ImportingJob implements Jsonizable {
writer.endObject();
}
}
}

View File

@ -37,8 +37,8 @@ import java.util.List;
import org.json.JSONObject;
import com.google.refine.ProjectMetadata;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
public interface ImportingParser {
/**

View File

@ -42,6 +42,7 @@ import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.io.UnsupportedEncodingException;
import java.net.URISyntaxException;
import java.net.URL;
import java.net.URLConnection;
import java.text.NumberFormat;
@ -49,9 +50,11 @@ import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;
import java.util.zip.GZIPInputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
@ -65,10 +68,14 @@ import org.apache.commons.fileupload.ProgressListener;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;
import org.apache.commons.fileupload.util.Streams;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.apache.http.HttpEntity;
import org.apache.http.HttpResponse;
import org.apache.http.HttpStatus;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DecompressingHttpClient;
import org.apache.http.impl.client.DefaultHttpClient;
@ -82,16 +89,35 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.RefineServlet;
import com.google.refine.importing.ImportingManager.Format;
import com.google.refine.importing.UrlRewriter.Result;
import com.google.refine.model.Column;
import com.google.refine.model.ColumnModel;
import com.google.refine.model.Project;
import com.google.refine.model.Row;
import com.google.refine.model.medadata.DataPackageMetadata;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.MetadataFactory;
import com.google.refine.model.medadata.MetadataFormat;
import com.google.refine.model.medadata.PackageExtension;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.preference.PreferenceStore;
import com.google.refine.util.JSONUtilities;
import io.frictionlessdata.datapackage.Package;
import io.frictionlessdata.tableschema.Field;
import io.frictionlessdata.tableschema.Schema;
import io.frictionlessdata.tableschema.TypeInferrer;
import io.frictionlessdata.tableschema.exceptions.TypeInferringException;
public class ImportingUtilities {
final static protected Logger logger = LoggerFactory.getLogger("importing-utilities");
private final static String METADATA_FILE_KEY = "metadataFile";
private static final int INFER_ROW_LIMIT = 100;
static public interface Progress {
public void setProgress(String message, int percent);
public boolean isCanceled();
@ -172,11 +198,11 @@ public class ImportingUtilities {
) throws Exception {
JSONArray fileRecords = new JSONArray();
JSONUtilities.safePut(retrievalRecord, "files", fileRecords);
JSONUtilities.safePut(retrievalRecord, "downloadCount", 0);
JSONUtilities.safePut(retrievalRecord, "archiveCount", 0);
int clipboardCount = 0;
int uploadCount = 0;
int downloadCount = 0;
int archiveCount = 0;
// This tracks the total progress, which involves uploading data from the client
// as well as downloading data from URLs.
@ -220,7 +246,7 @@ public class ImportingUtilities {
List<FileItem> tempFiles = (List<FileItem>)upload.parseRequest(request);
progress.setProgress("Uploading data ...", -1);
parts: for (FileItem fileItem : tempFiles) {
for (FileItem fileItem : tempFiles) {
if (progress.isCanceled()) {
break;
}
@ -255,28 +281,116 @@ public class ImportingUtilities {
} else if (name.equals("download")) {
String urlString = Streams.asString(stream);
URL url = new URL(urlString);
download(rawDataDir, retrievalRecord, progress, fileRecords, update, urlString);
processDataPackage(retrievalRecord, fileRecords);
} else if (name.equals("data-package")) {
String urlString = Streams.asString(stream);
List<Result> results = null;
for (UrlRewriter rewriter : ImportingManager.urlRewriters) {
results = rewriter.rewrite(urlString);
if (results != null) {
for (Result result : results) {
download(rawDataDir, retrievalRecord, progress, fileRecords,
update, result.rewrittenUrl, result.metaDataFormat);
}
}
}
} else {
String value = Streams.asString(stream);
parameters.put(name, value);
// TODO: We really want to store this on the request so it's available for everyone
// request.getParameterMap().put(name, value);
}
} else { // is file content
String fileName = fileItem.getName();
if (fileName.length() > 0) {
long fileSize = fileItem.getSize();
File file = allocateFile(rawDataDir, fileName);
JSONObject fileRecord = new JSONObject();
JSONUtilities.safePut(fileRecord, "origin", "upload");
JSONUtilities.safePut(fileRecord, "declaredEncoding", request.getCharacterEncoding());
JSONUtilities.safePut(fileRecord, "declaredMimeType", fileItem.getContentType());
JSONUtilities.safePut(fileRecord, "fileName", fileName);
JSONUtilities.safePut(fileRecord, "location", getRelativePath(file, rawDataDir));
progress.setProgress(
"Saving file " + fileName + " locally (" + formatBytes(fileSize) + " bytes)",
calculateProgressPercent(update.totalExpectedSize, update.totalRetrievedSize));
JSONUtilities.safePut(fileRecord, "size", saveStreamToFile(stream, file, null));
if (postProcessRetrievedFile(rawDataDir, file, fileRecord, fileRecords, progress)) {
JSONUtilities.safeInc(retrievalRecord, "archiveCount");
}
processDataPackage(retrievalRecord, fileRecords);
uploadCount++;
}
}
stream.close();
}
// Delete all temp files.
for (FileItem fileItem : tempFiles) {
fileItem.delete();
}
JSONUtilities.safePut(retrievalRecord, "uploadCount", uploadCount);
JSONUtilities.safePut(retrievalRecord, "clipboardCount", clipboardCount);
}
private static void processDataPackage(JSONObject retrievalRecord, JSONArray fileRecords) {
int dataPackageJSONFileIndex = getDataPackageJSONFile(fileRecords);
if (dataPackageJSONFileIndex >= 0) {
JSONObject dataPackageJSONFile = (JSONObject) fileRecords.get(dataPackageJSONFileIndex);
JSONUtilities.safePut(dataPackageJSONFile, "metaDataFormat", MetadataFormat.DATAPACKAGE_METADATA.name());
JSONUtilities.safePut(retrievalRecord, METADATA_FILE_KEY, dataPackageJSONFile);
fileRecords.remove(dataPackageJSONFileIndex);
}
}
private static int getDataPackageJSONFile(JSONArray fileRecords) {
for (int i = 0; i < fileRecords.length(); i++) {
JSONObject file = fileRecords.getJSONObject(i);
if (file.has("archiveFileName") &&
file.has("fileName") &&
file.get("fileName").equals(DataPackageMetadata.DEFAULT_FILE_NAME)) {
return i;
}
}
return -1;
}
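`processDataPackage` scans the retrieved file records for a `datapackage.json` descriptor (the `DataPackageMetadata.DEFAULT_FILE_NAME` constant in the diff), tags it with a metadata format, and removes it from the ordinary file list. A stdlib sketch of that scan over plain maps (the field names come from the diff; the map representation and the constant's value are assumptions):

```java
import java.util.List;
import java.util.Map;

class DataPackageScan {
    static final String DEFAULT_FILE_NAME = "datapackage.json"; // assumed value

    // Return the index of the file record that looks like a data package
    // descriptor: it came out of an archive and carries the default name.
    static int findDescriptor(List<Map<String, String>> fileRecords) {
        for (int i = 0; i < fileRecords.size(); i++) {
            Map<String, String> file = fileRecords.get(i);
            if (file.containsKey("archiveFileName")
                    && DEFAULT_FILE_NAME.equals(file.get("fileName"))) {
                return i;
            }
        }
        return -1; // no descriptor present
    }
}
```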
private static void download(File rawDataDir, JSONObject retrievalRecord, final Progress progress,
JSONArray fileRecords, final SavingUpdate update, String urlString)
throws URISyntaxException, IOException, ClientProtocolException, Exception {
download(rawDataDir, retrievalRecord, progress, fileRecords, update, urlString, null);
}
/**
* @param rawDataDir
* @param retrievalRecord
* @param progress
* @param fileRecords
* @param update
* @param urlString
* @throws URISyntaxException
* @throws IOException
* @throws ClientProtocolException
* @throws Exception
*/
private static void download(File rawDataDir, JSONObject retrievalRecord, final Progress progress,
JSONArray fileRecords, final SavingUpdate update, String urlString, String metaDataFormat)
throws URISyntaxException, IOException, ClientProtocolException, Exception {
URL url = new URL(urlString);
JSONObject fileRecord = new JSONObject();
JSONUtilities.safePut(fileRecord, "origin", "download");
JSONUtilities.safePut(fileRecord, "url", urlString);
for (UrlRewriter rewriter : ImportingManager.urlRewriters) {
Result result = rewriter.rewrite(urlString);
if (result != null) {
urlString = result.rewrittenUrl;
url = new URL(urlString);
JSONUtilities.safePut(fileRecord, "url", urlString);
JSONUtilities.safePut(fileRecord, "format", result.format);
if (!result.download) {
downloadCount++;
JSONUtilities.append(fileRecords, fileRecord);
continue parts;
}
}
}
if ("http".equals(url.getProtocol()) || "https".equals(url.getProtocol())) {
DefaultHttpClient client = new DefaultHttpClient();
DecompressingHttpClient httpclient =
@ -301,7 +415,12 @@ public class ImportingUtilities {
HttpResponse response = httpclient.execute(httpGet);
try {
int code = response.getStatusLine().getStatusCode();
if (code != HttpStatus.SC_OK) {
throw new Exception("HTTP response code: " + code +
" when accessing URL: "+ url.toString());
}
HttpEntity entity = response.getEntity();
if (entity == null) {
throw new Exception("No content found in " + url.toString());
@@ -317,12 +436,20 @@ public class ImportingUtilities {
contentType = entity.getContentType().getValue();
}
JSONUtilities.safePut(fileRecord, "declaredMimeType", contentType);
if (saveStream(stream2, url, rawDataDir, progress, update,
fileRecord, fileRecords,
entity.getContentLength())) {
JSONUtilities.safeInc(retrievalRecord, "archiveCount");
}
if (metaDataFormat != null) {
JSONUtilities.safePut(fileRecord, "metaDataFormat", metaDataFormat);
JSONUtilities.safePut(retrievalRecord, METADATA_FILE_KEY, fileRecord);
fileRecords.remove(0);
}
JSONUtilities.safeInc(retrievalRecord, "downloadCount");
EntityUtils.consume(entity);
} finally {
httpGet.releaseConnection();
@@ -341,60 +468,16 @@ public class ImportingUtilities {
if (saveStream(stream2, url, rawDataDir, progress,
update, fileRecord, fileRecords,
urlConnection.getContentLength())) {
JSONUtilities.safeInc(retrievalRecord, "archiveCount");
}
if (metaDataFormat != null)
JSONUtilities.safePut(fileRecord, "metaDataFormat", metaDataFormat);
JSONUtilities.safeInc(retrievalRecord, "downloadCount");
} finally {
stream2.close();
}
}
} else {
String value = Streams.asString(stream);
parameters.put(name, value);
// TODO: We really want to store this on the request so it's available for everyone
// request.getParameterMap().put(name, value);
}
} else { // is file content
String fileName = fileItem.getName();
if (fileName.length() > 0) {
long fileSize = fileItem.getSize();
File file = allocateFile(rawDataDir, fileName);
JSONObject fileRecord = new JSONObject();
JSONUtilities.safePut(fileRecord, "origin", "upload");
JSONUtilities.safePut(fileRecord, "declaredEncoding", request.getCharacterEncoding());
JSONUtilities.safePut(fileRecord, "declaredMimeType", fileItem.getContentType());
JSONUtilities.safePut(fileRecord, "fileName", fileName);
JSONUtilities.safePut(fileRecord, "location", getRelativePath(file, rawDataDir));
progress.setProgress(
"Saving file " + fileName + " locally (" + formatBytes(fileSize) + " bytes)",
calculateProgressPercent(update.totalExpectedSize, update.totalRetrievedSize));
JSONUtilities.safePut(fileRecord, "size", saveStreamToFile(stream, file, null));
if (postProcessRetrievedFile(rawDataDir, file, fileRecord, fileRecords, progress)) {
archiveCount++;
}
uploadCount++;
}
}
stream.close();
}
// Delete all temp files.
for (FileItem fileItem : tempFiles) {
fileItem.delete();
}
JSONUtilities.safePut(retrievalRecord, "uploadCount", uploadCount);
JSONUtilities.safePut(retrievalRecord, "downloadCount", downloadCount);
JSONUtilities.safePut(retrievalRecord, "clipboardCount", clipboardCount);
JSONUtilities.safePut(retrievalRecord, "archiveCount", archiveCount);
}
private static boolean saveStream(InputStream stream, URL url, File rawDataDir, final Progress progress,
@@ -1021,8 +1104,45 @@ public class ImportingUtilities {
if (exceptions.size() == 0) {
project.update(); // update all internal models, indexes, caches, etc.
boolean hasMetadataFileRecord = ((JSONObject)job.getRetrievalRecord()).has(METADATA_FILE_KEY);
if (hasMetadataFileRecord) {
JSONObject metadataFileRecord = (JSONObject) job.getRetrievalRecord().get(METADATA_FILE_KEY);
String metadataFormat = (String)metadataFileRecord.get("metaDataFormat");
IMetadata metadata = MetadataFactory.buildMetadata(MetadataFormat.valueOf(metadataFormat));
String relativePath = metadataFileRecord.getString("location");
File metadataFile = new File(job.getRawDataDir(), relativePath);
metadata.loadFromFile(metadataFile);
// process the data package metadata
if (MetadataFormat.valueOf(metadataFormat) == MetadataFormat.DATAPACKAGE_METADATA) {
populateDataPackageMetadata(project, pm, (DataPackageMetadata) metadata);
}
logger.info(metadataFileRecord.get("metaDataFormat") + " metadata is set for project " + project.id);
}
ProjectManager.singleton.registerProject(project, pm);
// infer the column type
if (project.columnModel.columns.get(0).getType().isEmpty()) {
List<Object[]> listCells = new ArrayList<Object[]>(INFER_ROW_LIMIT);
List<Row> rows = project.rows
.stream()
.limit(INFER_ROW_LIMIT)
.collect(Collectors.toList());
rows.forEach(r->listCells.add(r.cells.toArray()));
try {
JSONObject fieldsJSON = TypeInferrer.getInstance().infer(listCells,
project.columnModel.getColumnNames().toArray(new String[0]),
100);
populateColumnTypes(project.columnModel, fieldsJSON.getJSONArray(Schema.JSON_KEY_FIELDS));
} catch (TypeInferringException e) {
logger.error("infer column type exception.", ExceptionUtils.getStackTrace(e));
}
}
job.setProjectID(project.id);
job.setState("created-project");
} else {
@@ -1033,10 +1153,71 @@ public class ImportingUtilities {
}
}
private static void populateDataPackageMetadata(Project project, ProjectMetadata pmd, DataPackageMetadata metadata) {
// project metadata
JSONObject pkg = metadata.getPackage().getJson();
pmd.setName(getDataPackageProperty(pkg, Package.JSON_KEY_NAME));
pmd.setDescription(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_DESCRIPTION));
pmd.setTitle(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_TITLE));
pmd.setHomepage(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_HOMEPAGE));
pmd.setImage(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_IMAGE));
pmd.setLicense(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_LICENSE));
pmd.setVersion(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_VERSION));
if (pkg.has(PackageExtension.JSON_KEY_KEYWORKS)) {
String[] tags = pkg.getJSONArray(PackageExtension.JSON_KEY_KEYWORKS).toList().toArray(new String[0]);
pmd.appendTags(tags);
}
// column model
JSONObject schema = metadata.getPackage().getResources().get(0).getSchema();
if (schema != null) {
populateColumnTypes(project.columnModel, schema.getJSONArray(Schema.JSON_KEY_FIELDS));
}
}
private static String getDataPackageProperty(JSONObject pkg, String key) {
return JSONUtilities.getString(pkg, key, StringUtils.EMPTY);
}
/**
* Populate the column model
* @param columnModel
* @param fieldsJSON
*/
private static void populateColumnTypes(ColumnModel columnModel, JSONArray fieldsJSON) {
int cellIndex = 0;
Iterator<Object> iter = fieldsJSON.iterator();
while(iter.hasNext()){
JSONObject fieldJsonObj = (JSONObject)iter.next();
Field field = new Field(fieldJsonObj);
Column column = columnModel.getColumnByCellIndex(cellIndex);
column.setType(field.getType());
column.setFormat(field.getFormat());
column.setDescription(field.getDescription());
column.setTitle(field.getTitle());
column.setConstraints(field.getConstraints());
cellIndex++;
}
}
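The field-to-column mapping above is purely positional: the i-th schema field is copied onto the column at cell index i. A minimal, self-contained sketch of that idea (the `Col` and `Field` classes here are hypothetical stand-ins for OpenRefine's `Column` and tableschema's `Field`):

```java
import java.util.Arrays;
import java.util.List;

public class PopulateDemo {
    // Minimal stand-ins: a column and a schema field, for illustration only.
    static class Col { String name; String type; Col(String n) { name = n; } }
    static class Field { String type; Field(String t) { type = t; } }

    // Mirrors populateColumnTypes: fields and columns are matched by position.
    static void populate(List<Col> columns, List<Field> fields) {
        for (int i = 0; i < fields.size(); i++) {
            columns.get(i).type = fields.get(i).type;
        }
    }

    public static void main(String[] args) {
        List<Col> cols = Arrays.asList(new Col("id"), new Col("price"));
        populate(cols, Arrays.asList(new Field("integer"), new Field("number")));
        for (Col c : cols) {
            System.out.println(c.name + ": " + c.type);
        }
    }
}
```

The positional match assumes the schema's field order matches the project's column order, which holds when both come from the same data package resource.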
/**
* Create project metadata, pulling the "USER_NAME" from the PreferenceStore as the creator.
* @param optionObj
* @return
*/
static public ProjectMetadata createProjectMetadata(JSONObject optionObj) {
ProjectMetadata pm = new ProjectMetadata();
PreferenceStore ps = ProjectManager.singleton.getPreferenceStore();
pm.setName(JSONUtilities.getString(optionObj, "projectName", "Untitled"));
pm.setTags(JSONUtilities.getStringArray(optionObj, "projectTags"));
pm.setTitle(JSONUtilities.getString(optionObj, "title", ""));
pm.setHomepage(JSONUtilities.getString(optionObj, "homepage", ""));
pm.setImage(JSONUtilities.getString(optionObj, "image", ""));
pm.setLicense(JSONUtilities.getString(optionObj, "license", ""));
String encoding = JSONUtilities.getString(optionObj, "encoding", "UTF-8");
if ("".equals(encoding)) {
@@ -1044,6 +1225,12 @@ public class ImportingUtilities {
encoding = "UTF-8";
}
pm.setEncoding(encoding);
if (ps.get(PreferenceStore.USER_NAME) != null) {
String creator = (String) ps.get(PreferenceStore.USER_NAME);
pm.setCreator(creator);
}
return pm;
}
}


@@ -33,12 +33,45 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
package com.google.refine.importing;
import java.io.IOException;
import java.net.MalformedURLException;
import java.util.List;
/**
* Given a URL, implementations rewrite it into one or more URLs.
* Each rewritten URL is stored in a Result, which can then be used for downloading, parsing, etc.
* A typical use is parsing a data package JSON file.
* @see DataPackageUrlRewriter
*/
public interface UrlRewriter {
static public class Result {
public String rewrittenUrl;
public String format;
public boolean download;
public String metaDataFormat;
public Result(String rewrittenUrl, String format, boolean download) {
this.rewrittenUrl = rewrittenUrl;
this.format = format;
this.download = download;
}
public Result(String rewrittenUrl, String format, boolean download, String metaDataFormat) {
this.rewrittenUrl = rewrittenUrl;
this.format = format;
this.download = download;
this.metaDataFormat = metaDataFormat;
}
}
/**
* Parse the url and output the Result
* @param url
* @return
* @throws MalformedURLException
* @throws IOException
*/
public List<Result> rewrite(String url) throws MalformedURLException, IOException;
public boolean filter(String url);
}
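The rewriter contract can be sketched with a self-contained analogue. Everything below is illustrative: `SimpleUrlRewriter` and `DataPackageStyleRewriter` are hypothetical stand-ins for `UrlRewriter` and `DataPackageUrlRewriter`, and the results are simple string pairs rather than `Result` objects:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal stand-in for the UrlRewriter interface, for illustration only.
interface SimpleUrlRewriter {
    List<String[]> rewrite(String url);   // each entry: {rewrittenUrl, format}
    boolean filter(String url);
}

class DataPackageStyleRewriter implements SimpleUrlRewriter {
    private static final String METADATA_FILE = "datapackage.json";

    @Override
    public boolean filter(String url) {
        return url.endsWith(METADATA_FILE);
    }

    @Override
    public List<String[]> rewrite(String url) {
        List<String[]> results = new ArrayList<>();
        if (!filter(url)) {
            return results; // not a data package URL: nothing to rewrite
        }
        // First result: the metadata file itself, declared as JSON.
        results.add(new String[] { url, "json" });
        // Further results would come from the package's resource paths;
        // here a single hypothetical resource stands in for that list.
        String base = url.substring(0, url.length() - METADATA_FILE.length());
        results.add(new String[] { base + "data.csv", "" }); // format left to the guesser
        return results;
    }
}

public class RewriterDemo {
    public static void main(String[] args) {
        SimpleUrlRewriter r = new DataPackageStyleRewriter();
        for (String[] entry : r.rewrite("http://example.com/pkg/datapackage.json")) {
            System.out.println(entry[0] + " -> " + (entry[1].isEmpty() ? "(guess)" : entry[1]));
        }
    }
}
```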


@@ -57,9 +57,12 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectManager;
import com.google.refine.history.HistoryEntryManager;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.DataPackageMetadata;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.MetadataFormat;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.preference.TopList;
@@ -120,7 +123,6 @@ public class FileProjectManager extends ProjectManager {
if (metadata == null) {
metadata = ProjectMetadataUtilities.recover(getProjectDir(projectID), projectID);
}
if (metadata != null) {
_projectsMetadata.put(projectID, metadata);
if (_projectsTags == null) {
@@ -231,9 +233,19 @@
}
@Override
public void saveMetadata(IMetadata metadata, long projectId) throws Exception {
File projectDir = getProjectDir(projectId);
if (metadata.getFormatName() == MetadataFormat.PROJECT_METADATA) {
Project project = ProjectManager.singleton.getProject(projectId);
((ProjectMetadata)metadata).setRowCount(project.rows.size());
ProjectMetadataUtilities.save(metadata, projectDir);
} else if (metadata.getFormatName() == MetadataFormat.DATAPACKAGE_METADATA) {
DataPackageMetadata dp = (DataPackageMetadata)metadata;
dp.writeToFile(new File(projectDir, DataPackageMetadata.DEFAULT_FILE_NAME));
}
logger.info("metadata saved in " + metadata.getFormatName());
}
@Override
@@ -320,8 +332,6 @@ public class FileProjectManager extends ProjectManager {
return saveWasNeeded;
}
@Override
public void deleteProject(long projectID) {
synchronized (this) {
@@ -363,8 +373,6 @@
protected boolean loadFromFile(File file) {
logger.info("Loading workspace: {}", file.getAbsolutePath());
_projectsMetadata.clear();
boolean found = false;
if (file.exists() || file.canRead()) {


@@ -35,7 +35,6 @@ package com.google.refine.io;
import java.io.File;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
@@ -44,27 +43,25 @@ import java.time.LocalDateTime;
import java.time.ZoneId;
import java.util.List;
import org.apache.commons.lang3.StringUtils;
import org.json.JSONException;
import org.json.JSONObject;
import org.json.JSONTokener;
import org.json.JSONWriter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.ProjectMetadata;
public class ProjectMetadataUtilities {
final static Logger logger = LoggerFactory.getLogger("project_metadata_utilities");
public static void save(IMetadata projectMeta, File projectDir) throws JSONException, IOException {
File tempFile = new File(projectDir, ProjectMetadata.TEMP_FILE_NAME);
saveToFile(projectMeta, tempFile);
File file = new File(projectDir, ProjectMetadata.DEFAULT_FILE_NAME);
File oldFile = new File(projectDir, ProjectMetadata.OLD_FILE_NAME);
if (oldFile.exists()) {
oldFile.delete();
@@ -77,11 +74,15 @@ public class ProjectMetadataUtilities {
tempFile.renameTo(file);
}
public static void saveTableSchema(Project project, File projectDir) throws JSONException, IOException {
}
protected static void saveToFile(IMetadata projectMeta, File metadataFile) throws JSONException, IOException {
Writer writer = new OutputStreamWriter(new FileOutputStream(metadataFile));
try {
JSONWriter jsonWriter = new JSONWriter(writer);
projectMeta.write(jsonWriter);
projectMeta.write(jsonWriter, false);
} finally {
writer.close();
}
@@ -89,17 +90,17 @@
static public ProjectMetadata load(File projectDir) {
try {
return loadFromFile(new File(projectDir, ProjectMetadata.DEFAULT_FILE_NAME));
} catch (Exception e) {
}
try {
return loadFromFile(new File(projectDir, ProjectMetadata.TEMP_FILE_NAME));
} catch (Exception e) {
}
try {
return loadFromFile(new File(projectDir, ProjectMetadata.OLD_FILE_NAME));
} catch (Exception e) {
}
@@ -148,14 +149,8 @@
}
static protected ProjectMetadata loadFromFile(File metadataFile) throws Exception {
ProjectMetadata projectMetaData = new ProjectMetadata();
projectMetaData.loadFromFile(metadataFile);
return projectMetaData;
}
}


@@ -36,6 +36,8 @@ package com.google.refine.io;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;
@@ -45,6 +47,9 @@ import org.slf4j.LoggerFactory;
import com.google.refine.ProjectManager;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.DataPackageMetadata;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.MetadataFormat;
import com.google.refine.util.Pool;
@@ -111,27 +116,9 @@ public class ProjectUtilities {
}
}
static public Project loadDataFile(File dir, String dataFile, long id) {
try {
File file = new File(dir, dataFile);
if (file.exists()) {
return loadFromFile(file, id);
}
@@ -142,6 +129,49 @@
return null;
}
static public Project load(File dir, long id) {
Project project = null;
if ((project = loadDataFile(dir, "data.zip", id)) == null) {
if ((project = loadDataFile(dir, "data.temp.zip", id)) == null) {
project = loadDataFile(dir, "data.old.zip", id);
}
}
return project;
}
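The rewritten `load` chains three attempts through the extracted `loadDataFile`, falling back from `data.zip` to `data.temp.zip` to `data.old.zip`. The pattern in isolation, as a minimal sketch (the `firstNonNull` helper is hypothetical, not part of the PR):

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;

public class FallbackDemo {
    // Mimics ProjectUtilities.load: try each candidate in order and
    // return the first non-null result, or null if every attempt fails.
    static <T> T firstNonNull(List<String> candidates, Function<String, T> loader) {
        for (String name : candidates) {
            T result = loader.apply(name);
            if (result != null) {
                return result;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        List<String> names = Arrays.asList("data.zip", "data.temp.zip", "data.old.zip");
        // Pretend only the oldest backup copy is readable.
        String loaded = firstNonNull(names, n -> n.equals("data.old.zip") ? "project-from-" + n : null);
        System.out.println(loaded); // project-from-data.old.zip
    }
}
```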
/**
* Scan the folder for JSON files and read them as metadata.
* @param dir
*/
public static Map<MetadataFormat, IMetadata> retriveMetadata(File dir) {
// load the metadata files from the data folder.
Map<MetadataFormat, IMetadata> metadataMap = new HashMap<MetadataFormat, IMetadata>();
File[] jsons = dir.listFiles(
(folder, file) -> {
return file.toLowerCase().endsWith(".json");
}
);
for (File file : jsons) {
// already loaded
if (file.getName().startsWith("metadata."))
continue;
DataPackageMetadata metadata = new DataPackageMetadata();
// load itself
metadata.loadFromFile(file);
metadataMap.put(MetadataFormat.DATAPACKAGE_METADATA, metadata);
}
return metadataMap;
}
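The directory scan combines a `listFiles` lambda filter with a skip of the already-loaded `metadata.*` files. A self-contained sketch that folds both checks into one filter (the `candidateJsonFiles` helper is hypothetical; the real method filters `metadata.*` inside the loop instead):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class ScanDemo {
    // Keep *.json, skip metadata.* files that were loaded elsewhere.
    static File[] candidateJsonFiles(File dir) {
        return dir.listFiles((folder, name) ->
                name.toLowerCase().endsWith(".json") && !name.startsWith("metadata."));
    }

    public static void main(String[] args) throws IOException {
        File dir = Files.createTempDirectory("scan-demo").toFile();
        new File(dir, "datapackage.json").createNewFile();
        new File(dir, "metadata.json").createNewFile();   // already loaded: skipped
        new File(dir, "data.csv").createNewFile();        // not JSON: skipped
        for (File f : candidateJsonFiles(dir)) {
            System.out.println(f.getName());
        }
    }
}
```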
static protected Project loadFromFile(
File file,
long id


@@ -34,10 +34,12 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
package com.google.refine.model;
import java.io.Writer;
import java.lang.reflect.Method;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;
import org.apache.commons.lang3.StringUtils;
import org.json.JSONException;
import org.json.JSONObject;
import org.json.JSONWriter;
@@ -45,8 +47,14 @@ import org.json.JSONWriter;
import com.google.refine.InterProjectModel;
import com.google.refine.Jsonizable;
import com.google.refine.model.recon.ReconConfig;
import com.google.refine.util.JSONUtilities;
import com.google.refine.util.ParsingUtilities;
import io.frictionlessdata.tableschema.Field;
import io.frictionlessdata.tableschema.TypeInferrer;
import io.frictionlessdata.tableschema.exceptions.ConstraintsException;
import io.frictionlessdata.tableschema.exceptions.InvalidCastException;
public class Column implements Jsonizable {
final private int _cellIndex;
final private String _originalName;
@@ -54,6 +62,13 @@ public class Column implements Jsonizable {
private ReconConfig _reconConfig;
private ReconStats _reconStats;
// from data package metadata Field.java:
private String type = "";
private String format = Field.FIELD_FORMAT_DEFAULT;
private String title = "";
private String description = "";
private Map<String, Object> constraints = null;
transient protected Map<String, Object> _precomputes;
public Column(int cellIndex, String originalName) {
@@ -101,6 +116,11 @@
writer.key("cellIndex"); writer.value(_cellIndex);
writer.key("originalName"); writer.value(_originalName);
writer.key("name"); writer.value(_name);
writer.key("type"); writer.value(type);
writer.key("format"); writer.value(format);
writer.key("title"); writer.value(title);
writer.key("description"); writer.value(description);
writer.key("constraints"); writer.value(new JSONObject(constraints).toString());
if (_reconConfig != null) {
writer.key("reconConfig");
_reconConfig.write(writer, options);
@@ -140,6 +160,56 @@ public class Column implements Jsonizable {
_precomputes.put(key, value);
}
public String getType() {
return type;
}
public void setType(String type) {
this.type = type;
}
public String getFormat() {
return format;
}
public void setFormat(String format) {
this.format = format;
}
public String getTitle() {
return title;
}
public void setTitle(String title) {
this.title = title;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public Map<String, Object> getConstraints() {
return constraints;
}
public void setConstraints(Map<String, Object> constraints) {
this.constraints = constraints;
}
public void save(Writer writer) {
JSONWriter jsonWriter = new JSONWriter(writer);
try {
@@ -154,6 +224,14 @@ public class Column implements Jsonizable {
Column column = new Column(obj.getInt("cellIndex"), obj.getString("originalName"));
column._name = obj.getString("name");
column.type = JSONUtilities.getString(obj, Field.JSON_KEY_TYPE, StringUtils.EMPTY);
column.format = JSONUtilities.getString(obj, Field.JSON_KEY_FORMAT, StringUtils.EMPTY);
column.title = JSONUtilities.getString(obj, Field.JSON_KEY_TITLE, StringUtils.EMPTY);
column.description = JSONUtilities.getString(obj, Field.JSON_KEY_DESCRIPTION, StringUtils.EMPTY);
if (obj.has(Field.JSON_KEY_CONSTRAINTS)) {
column.constraints = new JSONObject(obj.getString(Field.JSON_KEY_CONSTRAINTS)).toMap();
}
if (obj.has("reconConfig")) {
column._reconConfig = ReconConfig.reconstruct(obj.getJSONObject("reconConfig"));
}
@@ -168,4 +246,23 @@
public String toString() {
return _name;
}
public <Any> Any castValue(String value)
throws InvalidCastException, ConstraintsException {
if (this.type.isEmpty()) {
throw new InvalidCastException();
} else {
try {
// Using reflection to invoke appropriate type casting method from the
// TypeInferrer class
String castMethodName = "cast" + (this.type.substring(0, 1).toUpperCase() + this.type.substring(1));
Method method = TypeInferrer.class.getMethod(castMethodName, String.class, String.class, Map.class);
Object castValue = method.invoke(TypeInferrer.getInstance(), this.format, value, null);
return (Any) castValue;
} catch (Exception e) {
throw new InvalidCastException();
}
}
}
}
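`castValue` builds a `TypeInferrer` method name from the schema type ("integer" becomes `castInteger`) and invokes it reflectively. The same dispatch in a self-contained sketch, where the `CastDemo` cast methods are hypothetical stand-ins for `TypeInferrer`'s:

```java
import java.lang.reflect.Method;

public class CastDemo {
    // Stand-ins for TypeInferrer: one cast method per schema type name.
    public static Integer castInteger(String format, String value) {
        return Integer.valueOf(value);
    }

    public static Boolean castBoolean(String format, String value) {
        return Boolean.valueOf(value);
    }

    // Mirrors Column.castValue: capitalize the type name, prefix "cast",
    // then look the method up and invoke it via reflection.
    @SuppressWarnings("unchecked")
    static <T> T cast(String type, String format, String value) throws Exception {
        String methodName = "cast" + type.substring(0, 1).toUpperCase() + type.substring(1);
        Method m = CastDemo.class.getMethod(methodName, String.class, String.class);
        return (T) m.invoke(null, format, value);
    }

    public static void main(String[] args) throws Exception {
        System.out.println((Integer) cast("integer", "default", "42"));  // 42
        System.out.println((Boolean) cast("boolean", "default", "true")); // true
    }
}
```

The reflective lookup trades compile-time safety for extensibility: adding a new schema type only requires adding a matching `castXxx` method.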


@@ -55,9 +55,9 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.ProjectManager;
import com.google.refine.RefineServlet;
import com.google.refine.history.History;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.process.ProcessManager;
import com.google.refine.util.ParsingUtilities;
import com.google.refine.util.Pool;
@@ -83,8 +83,7 @@ public class Project {
}
public Project() {
this(generateID());
}
protected Project(long id) {
@@ -121,10 +120,6 @@
this._lastSave = LocalDateTime.now();
}
public void saveToOutputStream(OutputStream out, Pool pool) throws IOException {
for (OverlayModel overlayModel : overlayModels.values()) {
try {
@@ -259,10 +254,13 @@
recordModel.update(this);
}
//wrapper of processManager variable to allow unit testing
//TODO make the processManager variable private, and force all calls through this method
public ProcessManager getProcessManager() {
return this.processManager;
}
public ProjectMetadata getMetadata() {
return ProjectManager.singleton.getProjectMetadata(id);
}
}


@@ -62,6 +62,21 @@ public class ColumnAdditionChange extends ColumnChange {
newCells.toArray(_newCells);
}
public String getColumnName() {
return _columnName;
}
public int getColumnIndex() {
return _columnIndex;
}
public int getNewCellIndex() {
return _newCellIndex;
}
@Override
public void apply(Project project) {
synchronized (project) {


@@ -58,6 +58,18 @@ public class ColumnMoveChange extends ColumnChange {
_newColumnIndex = index;
}
public int getOldColumnIndex() {
return _oldColumnIndex;
}
public String getColumnName() {
return _columnName;
}
public int getNewColumnIndex() {
return _newColumnIndex;
}
@Override
public void apply(Project project) {
synchronized (project) {


@@ -59,6 +59,10 @@ public class ColumnRemovalChange extends ColumnChange {
_oldColumnIndex = index;
}
public int getOldColumnIndex() {
return _oldColumnIndex;
}
@Override
public void apply(Project project) {
synchronized (project) {


@@ -57,6 +57,11 @@ public class ColumnReorderChange extends ColumnChange {
_columnNames = columnNames;
}
public List<String> getColumnNames() {
return _columnNames;
}
@Override
public void apply(Project project) {
synchronized (project) {


@@ -54,7 +54,7 @@ import com.google.refine.model.Project;
import com.google.refine.model.Row;
import com.google.refine.util.Pool;
public class ColumnSplitChange extends ColumnChange {
final protected String _columnName;
final protected List<String> _columnNames;
@@ -118,6 +118,21 @@ public class ColumnSplitChange implements Change {
_newRows = newRows;
}
public List<String> getColumnNames() {
return _columnNames;
}
public boolean isRemoveOriginalColumn() {
return _removeOriginalColumn;
}
public int getColumnIndex() {
return _columnIndex;
}
@Override
public void apply(Project project) {
synchronized (project) {


@@ -0,0 +1,73 @@
package com.google.refine.model.medadata;
import java.io.File;
import java.time.LocalDateTime;
import java.util.Properties;
import org.apache.commons.beanutils.PropertyUtils;
import org.json.JSONException;
import org.json.JSONObject;
import org.json.JSONWriter;
public abstract class AbstractMetadata implements IMetadata {
private MetadataFormat formatName = MetadataFormat.UNKNOWN;
protected LocalDateTime written = null;
protected LocalDateTime _modified;
public MetadataFormat getFormatName() {
return formatName;
}
public void setFormatName(MetadataFormat formatName) {
this.formatName = formatName;
}
@Override
public abstract void loadFromJSON(JSONObject obj);
@Override
public abstract void loadFromFile(File metadataFile);
@Override
public abstract void writeToFile(File metadataFile);
@Override
public boolean isDirty() {
return written == null || _modified.isAfter(written);
}
@Override
public LocalDateTime getModified() {
return _modified;
}
@Override
public void updateModified() {
_modified = LocalDateTime.now();
}
/**
* @param jsonWriter
* writer to save metadata to
* @param onlyIfDirty
* true to not write unchanged metadata
* @throws JSONException
*/
@Override
public void write(JSONWriter jsonWriter, boolean onlyIfDirty) throws JSONException {
if (!onlyIfDirty || isDirty()) {
Properties options = new Properties();
options.setProperty("mode", "save");
write(jsonWriter, options);
}
}
protected static boolean propertyExists(Object bean, String property) {
return PropertyUtils.isReadable(bean, property) &&
PropertyUtils.isWriteable(bean, property);
}
}
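`isDirty` compares the last-written timestamp against the last-modified one: metadata is dirty if it was never written, or was modified after the last write. The rule in miniature (the `DirtyDemo` class is a hypothetical reduction of `AbstractMetadata`):

```java
import java.time.LocalDateTime;

public class DirtyDemo {
    LocalDateTime written;   // null until the first successful save
    LocalDateTime modified;

    // Same rule as AbstractMetadata.isDirty().
    boolean isDirty() {
        return written == null || modified.isAfter(written);
    }

    public static void main(String[] args) {
        DirtyDemo d = new DirtyDemo();
        d.modified = LocalDateTime.of(2024, 1, 1, 12, 0);
        System.out.println(d.isDirty());   // true: never written

        d.written = LocalDateTime.of(2024, 1, 1, 12, 5);
        System.out.println(d.isDirty());   // false: the save is newer than the change

        d.modified = LocalDateTime.of(2024, 1, 1, 12, 10);
        System.out.println(d.isDirty());   // true: changed since the save
    }
}
```

This is what lets `write(jsonWriter, true)` skip serialization when nothing has changed since the last save.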


@@ -0,0 +1,130 @@
package com.google.refine.model.medadata;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.everit.json.schema.ValidationException;
import org.json.JSONException;
import org.json.JSONObject;
import org.json.JSONWriter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import io.frictionlessdata.datapackage.Package;
import io.frictionlessdata.datapackage.Resource;
import io.frictionlessdata.datapackage.exceptions.DataPackageException;
public class DataPackageMetadata extends AbstractMetadata {
private final static Logger logger = LoggerFactory.getLogger(DataPackageMetadata.class);
public static final String DEFAULT_FILE_NAME = "datapackage.json";
private Package _pkg;
public DataPackageMetadata() {
setFormatName(MetadataFormat.DATAPACKAGE_METADATA);
_pkg = PackageExtension.buildPackageFromTemplate();
}
@Override
public void loadFromJSON(JSONObject obj) {
try {
_pkg = new Package(obj);
} catch (ValidationException | DataPackageException | IOException e) {
logger.error("Load from JSONObject failed: " + obj.toString(4),
ExceptionUtils.getStackTrace(e));
}
logger.info("Data Package metadata loaded");
}
@Override
public void loadFromFile(File metadataFile) {
String jsonString = null;
try {
jsonString = FileUtils.readFileToString(metadataFile);
} catch (IOException e) {
logger.error("Load data package failed when reading from file: " + metadataFile.getAbsolutePath(),
ExceptionUtils.getStackTrace(e));
}
loadFromJSON(new JSONObject(jsonString));
}
/**
* Write the package to a json file.
*/
@Override
public void writeToFile(File metadataFile) {
try {
this._pkg.save(metadataFile.getAbsolutePath());
} catch (IOException e) {
logger.error("IO exception when writing to file " + metadataFile.getAbsolutePath(),
ExceptionUtils.getStackTrace(e));
} catch (DataPackageException e) {
logger.error("Data package exception when writing to file " + metadataFile.getAbsolutePath(),
ExceptionUtils.getStackTrace(e));
}
}
@Override
public void write(JSONWriter jsonWriter, Properties options)
throws JSONException {
// write the package JSON through the caller's writer so the
// serialized metadata actually reaches it
JSONObject json = _pkg.getJson();
jsonWriter.object();
for (String key : json.keySet()) {
jsonWriter.key(key).value(json.get(key));
}
jsonWriter.endObject();
}
@Override
public void loadFromStream(InputStream inputStream) {
try {
this._pkg = new Package(IOUtils.toString(inputStream));
} catch (ValidationException e) {
logger.error("validation failed", ExceptionUtils.getStackTrace(e));
} catch (DataPackageException e) {
logger.error("Data package exception when loading from stream", ExceptionUtils.getStackTrace(e));
} catch (IOException e) {
logger.error("IO exception when loading from stream", ExceptionUtils.getStackTrace(e));
}
}
public List<String> getResourcePaths() {
List<String> listResources = new ArrayList<String>();
for (Resource resource : _pkg.getResources()) {
listResources.add((String) resource.getPath());
}
return listResources;
}
@Override
public JSONObject getJSON() {
return _pkg.getJson();
}
public Package getPackage() {
return _pkg;
}
@Override
public List<Exception> validate() {
try {
_pkg.validate();
} catch (ValidationException | IOException | DataPackageException e) {
logger.error("validate json failed", ExceptionUtils.getStackTrace(e));
}
return _pkg.getErrors();
}
}


@@ -0,0 +1,44 @@
package com.google.refine.model.medadata;
import java.io.IOException;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import com.google.refine.importing.UrlRewriter;
public class DataPackageUrlRewriter implements UrlRewriter {
@Override
public List<Result> rewrite(String url) throws MalformedURLException, IOException {
List<Result> listResult = new ArrayList<Result>();
if (!filter(url))
return listResult;
listResult.add(new Result(url, "json", true, MetadataFormat.DATAPACKAGE_METADATA.name()));
DataPackageMetadata meta = new DataPackageMetadata();
meta.loadFromStream(new URL(url).openStream());
// Import the data files.
for (String path : meta.getResourcePaths()) {
String fileURL = getBaseURL(url) + "/" + path;
listResult.add(new Result(fileURL,
"", // leave to guesser. "text/line-based/*sv"
true));
}
return listResult;
}
@Override
public boolean filter(String url) {
return url.endsWith(DataPackageMetadata.DEFAULT_FILE_NAME);
}
private String getBaseURL(String url) {
return url.replaceFirst(DataPackageMetadata.DEFAULT_FILE_NAME, "");
}
}
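The base-URL derivation is plain string handling; one subtlety worth noting is that `String.replaceFirst` treats its first argument as a regex, so the dot in `datapackage.json` matches any character — harmless for this fixed name, but easy to trip over in general. A runnable sketch of the same logic (the `BaseUrlDemo` class is illustrative, not part of the PR):

```java
public class BaseUrlDemo {
    static final String METADATA_FILE = "datapackage.json";

    // Mirrors DataPackageUrlRewriter.getBaseURL: strip the descriptor name,
    // leaving the directory URL the resource paths are resolved against.
    // Note replaceFirst takes a regex pattern, not a literal string.
    static String getBaseURL(String url) {
        return url.replaceFirst(METADATA_FILE, "");
    }

    public static void main(String[] args) {
        String url = "http://example.com/mypkg/datapackage.json";
        String base = getBaseURL(url);
        System.out.println(base);                   // http://example.com/mypkg/
        System.out.println(base + "data/rows.csv"); // a resolved resource URL
    }
}
```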


@@ -0,0 +1,44 @@
package com.google.refine.model.medadata;
import java.io.File;
import java.io.InputStream;
import java.time.LocalDateTime;
import java.util.List;
import org.json.JSONException;
import org.json.JSONObject;
import org.json.JSONWriter;
import com.google.refine.Jsonizable;
/**
* Interface to import/export metadata
*/
public interface IMetadata extends Jsonizable {
public void loadFromJSON(JSONObject obj);
public void loadFromFile(File metadataFile);
public void loadFromStream(InputStream inputStream);
public void writeToFile(File metadataFile);
/**
* @param jsonWriter writer to save metadata to
* @param onlyIfDirty true to not write unchanged metadata
* @throws JSONException
*/
public void write(JSONWriter jsonWriter, boolean onlyIfDirty);
public MetadataFormat getFormatName();
public void setFormatName(MetadataFormat format);
public LocalDateTime getModified();
public void updateModified();
public boolean isDirty();
public JSONObject getJSON();
public List<Exception> validate();
}


@@ -0,0 +1,82 @@
package com.google.refine.model.medadata;
import java.io.IOException;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.everit.json.schema.ValidationException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.refine.model.Project;
import com.google.refine.util.JSONUtilities;
import com.google.refine.util.ParsingUtilities;
import io.frictionlessdata.datapackage.Package;
import io.frictionlessdata.datapackage.Resource;
import io.frictionlessdata.datapackage.exceptions.DataPackageException;
public class MetadataFactory {
private final static Logger logger = LoggerFactory.getLogger(MetadataFactory.class);
/**
* Build metadata based on the format.
* @param format the metadata format to instantiate
* @return a new IMetadata instance, or null if the format is unsupported
*/
public static IMetadata buildMetadata(MetadataFormat format) {
IMetadata metadata = null;
if (format == MetadataFormat.PROJECT_METADATA) {
metadata = new ProjectMetadata();
} else if (format == MetadataFormat.DATAPACKAGE_METADATA) {
metadata = new DataPackageMetadata();
}
return metadata;
}
/**
* Build an empty data package metadata.
* @return a new, empty DataPackageMetadata
*/
public static DataPackageMetadata buildDataPackageMetadata() {
return (DataPackageMetadata) buildMetadata(MetadataFormat.DATAPACKAGE_METADATA);
}
/**
* Build an empty data package metadata, then populate its fields from the project metadata.
* @param project the project whose metadata is copied over
* @return a DataPackageMetadata populated from the project
*/
public static DataPackageMetadata buildDataPackageMetadata(Project project) {
DataPackageMetadata dpm = buildDataPackageMetadata();
ProjectMetadata pmd = project.getMetadata();
Package pkg = dpm.getPackage();
Resource resource = SchemaExtension.createResource(project.getMetadata().getName(),
project.columnModel);
try {
pkg.addResource(resource);
putValue(pkg, Package.JSON_KEY_NAME, pmd.getName());
putValue(pkg, PackageExtension.JSON_KEY_LAST_UPDATED, ParsingUtilities.localDateToString(pmd.getModified()));
putValue(pkg, PackageExtension.JSON_KEY_DESCRIPTION, pmd.getDescription());
putValue(pkg, PackageExtension.JSON_KEY_TITLE, pmd.getTitle());
putValue(pkg, PackageExtension.JSON_KEY_HOMEPAGE, pmd.getHomepage());
putValue(pkg, PackageExtension.JSON_KEY_IMAGE, pmd.getImage());
putValue(pkg, PackageExtension.JSON_KEY_LICENSE, pmd.getLicense());
pkg.removeProperty(PackageExtension.JSON_KEY_KEYWORKS);
pkg.addProperty(PackageExtension.JSON_KEY_KEYWORKS, JSONUtilities.arrayToJSONArray(pmd.getTags()));
} catch (ValidationException | IOException | DataPackageException e) {
logger.error(ExceptionUtils.getStackTrace(e));
}
return dpm;
}
private static void putValue(Package pkg, String key, String value) throws DataPackageException {
if(pkg.getJson().has(key)) {
pkg.removeProperty(key);
}
pkg.addProperty(key, value);
}
}
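`MetadataFactory.buildMetadata` dispatches on the `MetadataFormat` enum and returns null for unsupported formats, which callers must handle. A standalone sketch of that dispatch pattern follows, with hypothetical stand-in types (`MetadataFactorySketch`, `Meta`, and friends) since the real `ProjectMetadata` and `DataPackageMetadata` classes are not available here.

```java
// Standalone sketch of the enum-dispatched factory used by MetadataFactory.
public class MetadataFactorySketch {
    public enum Format { PROJECT_METADATA, DATAPACKAGE_METADATA, UNKNOWN }

    public interface Meta {
        Format format();
    }

    public static class ProjectMeta implements Meta {
        public Format format() { return Format.PROJECT_METADATA; }
    }

    public static class DataPackageMeta implements Meta {
        public Format format() { return Format.DATAPACKAGE_METADATA; }
    }

    // Mirrors buildMetadata(): one branch per supported format,
    // null for formats with no backing implementation.
    public static Meta build(Format format) {
        if (format == Format.PROJECT_METADATA) {
            return new ProjectMeta();
        } else if (format == Format.DATAPACKAGE_METADATA) {
            return new DataPackageMeta();
        }
        return null;
    }
}
```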


@@ -0,0 +1,24 @@
package com.google.refine.model.medadata;
/**
* The supported metadata formats.
*
*/
public enum MetadataFormat {
UNKNOWN("UNKNOWN"),
PROJECT_METADATA("PROJECT_METADATA"),
DATAPACKAGE_METADATA("DATAPACKAGE_METADATA"),
CSVW_METADATA("CSVW_METADATA");
private final String format;
private MetadataFormat(final String format) {
this.format = format;
}
@Override
public String toString() {
return format;
}
}


@@ -0,0 +1,88 @@
package com.google.refine.model.medadata;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.OutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;
import org.apache.commons.io.FileUtils;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.apache.commons.lang3.StringUtils;
import org.everit.json.schema.ValidationException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import io.frictionlessdata.datapackage.Package;
import io.frictionlessdata.datapackage.exceptions.DataPackageException;
/**
* This class contains methods that are not yet included in the official "Data Package" library.
* They can be removed once the official library provides the corresponding functionality.
*/
public class PackageExtension {
private final static Logger logger = LoggerFactory.getLogger(PackageExtension.class);
private static final int JSON_INDENT_FACTOR = 4;
public static final String JSON_KEY_LAST_UPDATED = "last_updated";
public static final String JSON_KEY_DESCRIPTION = "description";
public static final String JSON_KEY_KEYWORKS = "keywords";
public static final String JSON_KEY_TITLE = "title";
public static final String JSON_KEY_HOMEPAGE = "homepage";
public static final String JSON_KEY_IMAGE = "image";
public static final String JSON_KEY_LICENSE = "license";
public static final String JSON_KEY_VERSION = "version";
public static final String DATAPACKAGE_TEMPLATE_FILE = "schemas/datapackage-template.json";
/**
* Bundle the package manually, since the final spec for compression/bundling is not settled yet.
* https://github.com/frictionlessdata/datapackage-js/issues/93
*
* @param pkg Package
* @param dataByteArrayOutputStream ByteArrayOutputStream
* @param destOs OutputStream
* @throws IOException
* @throws FileNotFoundException
* @see Package#saveZip(String outputFilePath)
*/
public static void saveZip(Package pkg, final ByteArrayOutputStream dataByteArrayOutputStream, final OutputStream destOs) throws FileNotFoundException, IOException {
try(ZipOutputStream zos = new ZipOutputStream(destOs)){
// json file
ZipEntry entry = new ZipEntry(DataPackageMetadata.DEFAULT_FILE_NAME);
zos.putNextEntry(entry);
zos.write(pkg.getJson().toString(JSON_INDENT_FACTOR).getBytes());
zos.closeEntry();
// default the data file to data.csv, or use the given path (only one file can be handled because the original files cannot be restored)
String path = (String) pkg.getResources().get(0).getPath();
entry = new ZipEntry(StringUtils.isBlank(path) ? "data.csv" : path);
zos.putNextEntry(entry);
zos.write(dataByteArrayOutputStream.toByteArray());
zos.closeEntry();
}
}
/**
* Build a Package object from a template file containing empty metadata.
*
* @return the Package built from the template, or null if the template cannot be read
*/
public static Package buildPackageFromTemplate() {
try {
ClassLoader classLoader = PackageExtension.class.getClassLoader();
File file = new File(classLoader.getResource(DATAPACKAGE_TEMPLATE_FILE).getFile());
return new Package(FileUtils.readFileToString(file), false);
} catch (ValidationException e) {
logger.error("validation failed: {}", ExceptionUtils.getStackTrace(e));
} catch (DataPackageException e) {
logger.error("DataPackage exception: {}", ExceptionUtils.getStackTrace(e));
} catch (IOException e) {
logger.error("IOException when building package from template: {}", ExceptionUtils.getStackTrace(e));
}
return null;
}
}
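`saveZip` writes exactly two entries: the descriptor JSON under the default file name, and a single data file whose entry name falls back to `data.csv` when the resource path is blank. The following JDK-only sketch reproduces that layout; `SaveZipSketch` and its helper are hypothetical (the real method takes a `Package` and streams), but the entry order and fallback match the code above.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

// Standalone sketch of the two-entry zip layout produced by saveZip().
public class SaveZipSketch {
    public static byte[] bundle(String descriptorJson, String dataPath, byte[] data) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(bos)) {
            // First entry: the descriptor JSON.
            zos.putNextEntry(new ZipEntry("datapackage.json"));
            zos.write(descriptorJson.getBytes(StandardCharsets.UTF_8));
            zos.closeEntry();
            // Second entry: the single data file, defaulting to data.csv
            // when no resource path is given, as saveZip() does.
            String name = (dataPath == null || dataPath.trim().isEmpty()) ? "data.csv" : dataPath;
            zos.putNextEntry(new ZipEntry(name));
            zos.write(data);
            zos.closeEntry();
        }
        return bos.toByteArray();
    }

    // Helper for inspecting the result: list the entry names in a zip.
    public static List<String> entryNames(byte[] zip) throws Exception {
        List<String> names = new ArrayList<>();
        try (ZipInputStream zis = new ZipInputStream(new ByteArrayInputStream(zip))) {
            ZipEntry e;
            while ((e = zis.getNextEntry()) != null) {
                names.add(e.getName());
            }
        }
        return names;
    }
}
```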
