data package metadata (#1398)

* fix the appbundle issue #1209

* fix #1162

allow JRE 9

* fix the package declarations

* remove the _ from the method name

* use the explicit scoping

* remove extra ;

* fix issues from Codacy

* fix issues from Codacy

* add preferences link to the index page

* handle the empty user metadata

* fix 'last modified' sorting issue #1307

* prevent overflow of the table. issue #1306

* add isoDateParser to sort the date

* prevent overflow of the project index

* remove sorter arrow for action columns

* disable editing the internal metadata

* adjust the width of the table

* change MetaData to Metadata

* change the field name from rowNumber to rowCount

* put back the accidentally deleted .gitignore

* add double quotes to prevent word splitting

* UI improvement on metadata view and project list view

* remove the date field in metadata

* add a notification message about free RAM. Issue #1295

* UI tuning for metadata view

* shorten the ISO date to locale date format

* Added translation using Weblate (Portuguese (Brazil))

* remove the rename link

* Ignore empty language files introduced by Weblate

* Add UI for Invert text filter

* Backend support for Inverting Text search facets
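A rough sketch of the idea (class and field names are illustrative, not the actual facet code): an invert flag wrapping the existing text-match predicate.

    // Illustrative sketch only: inverting a text-match predicate.
    public class InvertibleTextFilter {
        private final String query;
        private final boolean invert;

        public InvertibleTextFilter(String query, boolean invert) {
            this.query = query.toLowerCase();
            this.invert = invert;
        }

        /** Returns true when the cell value passes the (possibly inverted) filter. */
        public boolean matches(String cellValue) {
            boolean hit = cellValue != null && cellValue.toLowerCase().contains(query);
            return invert ? !hit : hit;
        }
    }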

* Fix reset on text search facet

* More succinct return statements

* add tests for SetProjectMetadataCommand

* Tidying up for Codacy

* Added Tests for TextSearchFilter

* Corrections for Codacy

* More code tidy up

* let the browser auto-fit the table cells when resizing/zooming

* fix importing multiple Excel files with multiple sheets. Issue #1328

* check if the project has the userMetadata

* fix the unit test
support multiple files with multiple tables for OpenOffice

* prevent the same key for user metadata

* replace _ with a named variable for the exception

* fix the no-undef issue

* adjust the width of the transform dialog. Issue #1332

* fix the row count refresh issue

* extract method

* move the log message

* cosmetic changes for Codacy

* fix typo

* bump to version 2.8

* .gitignore is now working

* the preview stage won't have the metadata populated, so guard against an NPE

* Update README.md

No more direct link to the latest version tag, which avoids having to remember to update the README

* refactor the ProjectMetadata class

* introduce the IMetadata interface

* create submodule of dataschema

* add back

* set up the lib for dataschema; upgrade Apache commons-lang to commons-lang3

* replace escape* functions from apache lang3
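For reference, the commons-lang 2 escape helpers have commons-lang3 equivalents such as escapeHtml becoming escapeHtml4; a small sketch of the lang3 calls:

    import org.apache.commons.lang3.StringEscapeUtils;

    public class EscapeMigrationExample {
        public static void main(String[] args) {
            // commons-lang 2.x used org.apache.commons.lang.StringEscapeUtils.escapeHtml(...)
            // commons-lang3 renames the HTML 4.0 variants:
            String safe = StringEscapeUtils.escapeHtml4("<a href=\"x\">link & more</a>");  // escapes <, >, & and quotes
            System.out.println(safe);
            System.out.println(StringEscapeUtils.unescapeHtml4(safe));  // round-trips back
        }
    }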

* replace the ProjectMetadata with IMetadata interface

* add missing jars

* make IMetadata a field of Project

* move PreferenceStore out of the Project model

* fix test SetProjectMetadataCommandTests by casting

* introduce AbstractMetadata

* introduce AbstractMetadata

* reorganize the metadata package

* allow a project to have multiple metadata

* support multiple metadata formats

* remove JDK 7 since the 'table schema' Java implementation only supports JDK 8+

* set execute permission for script

* fix the Unit Test after Metadata refactoring

* restore Apache commons-lang 2.5 since Jetty 6.1.22 depends on it

* add the commons-lang 2.5 jar

* git submodule add  https://github.com/frictionlessdata/datapackage-java

* remove the metadata parameter from the ProjectManager.registerProject method

* remove hashmap _projectsMetadata field from the ProjectManager and FileProjectManager

* init the Project.metadataMap

* fix Unit Test

* restore the ProjectMetadata map to ProjectManager

* put ProjectMetadata in place for the ProjectManager and Project objects

* check the singleton for null instead of creating a constructor just for tests
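In other words, lazy initialization guarded by a null check rather than a test-only constructor; a generic sketch with illustrative names:

    public class SingletonHolder {
        private static SingletonHolder singleton;

        public static synchronized SingletonHolder getInstance() {
            if (singleton == null) {          // created on first use; tests may have installed one already
                singleton = new SingletonHolder();
            }
            return singleton;
        }

        // visible for tests: lets a test install a stub without a dedicated test constructor
        static void setInstance(SingletonHolder stub) {
            singleton = stub;
        }
    }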

* load the data package metadata

* importing data package

* importing data package

* encapsulate the Package class into DataPackageMetadata

* use _ to indicate the class fields

* introduce base URL in order to download the data files
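Resource paths in datapackage.json are typically relative to the descriptor, so they are resolved against a base URL before download; a minimal sketch using java.net.URL (the helper and example URL are illustrative):

    import java.net.MalformedURLException;
    import java.net.URL;

    public class ResourceUrlResolver {
        /** Resolves a resource path from datapackage.json against the package's base URL. */
        public static URL resolve(String baseUrl, String resourcePath) throws MalformedURLException {
            // e.g. base "http://example.com/pkg/datapackage.json" + "data/cities.csv"
            //   -> "http://example.com/pkg/data/cities.csv"
            return new URL(new URL(baseUrl), resourcePath);
        }
    }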

* import data package UI and draft backend

* import data package UI

* fix typo

* download the data set pointed to by the metadata resource

* save and load the data package metadata

* avoid magic string

* package cleanup

* set the java_version to 1.8

* set the min jdk to 1.8

* add the 3rd party src in the build.xml

* skip the file selection page if only 1 DATA file

* add files structure for json editor

* separate out the metadata file from the retrieval file list

* rename the OKF_METADATA to DATAPACKAGE_METADATA

* clean up

* implement GetMetadateCommand class

* display the metadata in json format

* git submodule update --remote --merge

* adjust the setting after pulling from datapackage origin

* fix the failing UT DateExtensionTests.testFetchCounts: the new json jar (json-20160810.jar) complains that JSONObject["float"] is not a string
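With json-20160810, JSONObject.getString() throws when the stored value is a number rather than a String (hence the 'not a string' complaint); a minimal illustration of reading such a value:

    import org.json.JSONObject;

    public class JsonNumberReadExample {
        public static void main(String[] args) {
            JSONObject obj = new JSONObject("{\"float\": 1.5}");
            // obj.getString("float") would throw: JSONObject["float"] not a string.
            String asText = obj.get("float").toString();    // works for any value type
            double asNumber = obj.getDouble("float");       // or read it as the number it is
            System.out.println(asText + " / " + asNumber);
        }
    }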

* clean up the unusual array-loop syntax that was flagged

* remove the unused constant

* export in data package format

* interface cleanup

* fix UT

* edit the metadata

* add UT for SetMetadataCommand

* fix UT for SetMetadataCommand

* display the data package metadata link on the project index page

* update submodule

* log the exceptions

* Ajv does not work properly; use the back-end validation instead

* enable the validation for jsoneditor

* first draft of the data validation

* create a map to hold the constraint and its handler
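A sketch of that dispatch idea, mapping the Table Schema constraint name to a handler; the interface and class names are illustrative, not the PR's actual code:

    import java.util.HashMap;
    import java.util.Map;

    interface ConstraintHandler {
        /** Returns an error message, or null when the value satisfies the constraint. */
        String check(Object value, Object constraintSpec);
    }

    class ConstraintRegistry {
        private final Map<String, ConstraintHandler> handlers = new HashMap<>();

        ConstraintRegistry() {
            handlers.put("required", (v, spec) -> Boolean.TRUE.equals(spec) && v == null ? "value is required" : null);
            handlers.put("minLength", (v, spec) -> v != null && v.toString().length() < (int) spec ? "too short" : null);
            handlers.put("maxLength", (v, spec) -> v != null && v.toString().length() > (int) spec ? "too long" : null);
        }

        String validate(String constraintName, Object value, Object constraintSpec) {
            ConstraintHandler h = handlers.get(constraintName);
            return h == null ? null : h.check(value, constraintSpec);
        }
    }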

* rename

* support for minLength and maxLength from spec

* add validate command

* test the operation instead of the validate command

* rename the UT

* format the error message and push to the report

* fix row number

* add resource bundle for validator

* inject the code of the constraints

* make the StrSubstitutor work
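A minimal sketch assuming the commons-text 1.1 StrSubstitutor (that jar is added to the classpath in this PR) with a custom prefix/suffix, since the validator message templates use {name} placeholders rather than the default ${name}:

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.commons.text.StrSubstitutor;

    public class MessageFormatExample {
        public static void main(String[] args) {
            String template = "Row {row_number} has a missing value in column {column_number}";
            Map<String, String> values = new HashMap<>();
            values.put("row_number", "12");
            values.put("column_number", "3");
            // the templates use {name} placeholders, so override the default ${name} prefix/suffix
            StrSubstitutor sub = new StrSubstitutor(values, "{", "}");
            System.out.println(sub.replace(template));
            // -> Row 12 has a missing value in column 3
        }
    }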

* extract the type and format information

* add customizedFormat to the interface to allow proper formatting

* get rid of magic string

* take care of missing parts of the data package

* implement RequiredConstraint

* patch for number type

* add max/min constraints

* get the constraints directly from the field

* implement the PatternConstraint
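In Table Schema, 'pattern' is a regular expression the cell value must conform to; an illustrative handler using java.util.regex (the class shape is an assumption, not the PR's actual code):

    import java.util.regex.Pattern;

    public class PatternConstraint {
        private final Pattern pattern;

        public PatternConstraint(String regex) {
            this.pattern = Pattern.compile(regex);
        }

        /** True when the value conforms to the pattern constraint (null is left to the required check). */
        public boolean isSatisfied(Object value) {
            return value == null || pattern.matcher(value.toString()).matches();
        }
    }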

* suppress warning

* fix the broken UT expecting a 2-digit fraction

* handle the cast and type properly

* fix the missing resource files for data package when run from command line

* use the copy instead of copydir

* add script for appveyor

* update script for appveyor

* do recursive clone

* correct the git url

* fix clone path

* clone folder option does not work

* will open another PR for this; delete for now

* revert the interface method name

* lazy loading the project data

* disable the validate menu for now

* add UT

* assert UTs

* add UT

* fix #1386

* remove import

* test the thread

* Revert "test the thread"

This reverts commit 779214160055afe3ccdcc18c57b0c7c72e87c824.

* fix the URLCachingTest UT

* define the template data package

* tidy up the metadata interface

* check the HTTP response code
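That is, fail fast on a bad status code before trying to read the body; a minimal sketch with HttpURLConnection (the helper method is illustrative):

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class HttpCheckExample {
        static InputStream openChecked(String url) throws IOException {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            int code = conn.getResponseCode();
            if (code != HttpURLConnection.HTTP_OK) {
                // surface a user-friendly message instead of failing later with a parse error
                throw new IOException("URL " + url + " returned HTTP " + code);
            }
            return conn.getInputStream();
        }
    }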

* fix the package

* display user friendly message when URL path is not reachable

* populate the data package schema

* Delete hs_err_pid15194.log

* populate data package info

* add a username preference; it will be pulled in as the creator of the metadata

* undo the project.updateColumnChange() and start to introduce the fields into the existing core model

* tightly integrate the data package metadata

* tightly integrate the data package metadata for project level

* remove the submodule

* move the edit button

* clean up build

* load the new property

* load the project metadata

* fix issues from codacy

* remove unused fields and annotation

* check the HTTP response code first

* import zipped data package
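A zipped data package carries datapackage.json alongside its data files; a sketch of locating the descriptor with java.util.zip (the helper is hypothetical, not the PR's importer code):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipInputStream;

    public class ZippedDataPackageExample {
        /** Returns the content of datapackage.json from a zipped data package, or null if absent. */
        static String readDescriptor(InputStream zipStream) throws IOException {
            try (ZipInputStream zis = new ZipInputStream(zipStream)) {
                ZipEntry entry;
                while ((entry = zis.getNextEntry()) != null) {
                    if (entry.getName().endsWith("datapackage.json")) {
                        ByteArrayOutputStream out = new ByteArrayOutputStream();
                        byte[] buf = new byte[4096];
                        int n;
                        while ((n = zis.read(buf)) != -1) {   // reads until the end of this entry
                            out.write(buf, 0, n);
                        }
                        return out.toString(StandardCharsets.UTF_8.name());
                    }
                }
            }
            return null;
        }
    }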

* allow data packages without keywords

* process the zip data package from url

* merge the tags

* check the store first

* remove the table schema src

* move the json schema files to schema dir

* add comment

* add comment

* remove git modules

* add back the accidentally deleted file

* fix typo

* remove SetMetadataCommand

* revert change

* merge from master
Authored by Jacky on 2018-02-02 08:24:19 -05:00; committed by Antonin Delpeuch
parent cd58557424
commit c4b0ff6bea
222 changed files with 56544 additions and 1240 deletions

@@ -1,6 +1,7 @@
 <?xml version="1.0" encoding="UTF-8"?>
 <classpath>
 <classpathentry kind="src" path="main/src"/>
+<classpathentry kind="src" path="main/resources"/>
 <classpathentry kind="src" path="extensions/jython/tests/src"/>
 <classpathentry kind="src" path="server/src"/>
 <classpathentry kind="src" path="extensions/gdata/src"/>
@@ -14,15 +15,9 @@
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/arithcode-1.1.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/butterfly-1.0.1.jar" sourcepath="main/webapp/WEB-INF/lib-src/butterfly-1.0.1-sources.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/clojure-1.5.1-slim.jar"/>
-<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-codec-1.6.jar"/>
-<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-collections-3.2.1.jar"/>
-<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-fileupload-1.2.1.jar"/>
-<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-io-1.4.jar"/>
-<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-lang-2.5.jar" sourcepath="main/webapp/WEB-INF/lib-src/commons-lang-2.5-sources.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/dom4j-1.6.1.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/jcl-over-slf4j-1.5.6.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/jrdf-0.5.6.jar" sourcepath="main/webapp/WEB-INF/lib-src/jrdf-0.5.6-sources.jar"/>
-<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/json-20100208.jar" sourcepath="main/webapp/WEB-INF/lib-src/json-20100208-sources.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/lessen-trunk-r8.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/log4j-1.2.15.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/marc4j-2.4.jar"/>
@@ -46,8 +41,6 @@
 <classpathentry exported="true" kind="lib" path="extensions/gdata/module/MOD-INF/lib/gdata-core-1.0.jar"/>
 <classpathentry exported="true" kind="lib" path="extensions/gdata/module/MOD-INF/lib/gdata-spreadsheet-3.0.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/jsoup-1.4.1.jar"/>
-<classpathentry exported="true" kind="lib" path="main/tests/server/lib/mockito-all-1.9.5.jar"/>
-<classpathentry exported="true" kind="lib" path="main/tests/server/lib/testng-6.8.jar"/>
 <classpathentry exported="true" kind="lib" path="extensions/gdata/module/MOD-INF/lib/gdata-docs-3.0.jar"/>
 <classpathentry exported="true" kind="lib" path="extensions/gdata/module/MOD-INF/lib/gdata-docs-meta-3.0.jar"/>
 <classpathentry exported="true" kind="lib" path="extensions/gdata/module/MOD-INF/lib/gdata-media-1.0.jar"/>
@@ -67,7 +60,6 @@
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/signpost-core-1.2.1.2.jar" sourcepath="main/webapp/WEB-INF/lib-src/signpost-core-1.2.1.2-sources.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/guava-13.0.jar"/>
 <classpathentry kind="lib" path="extensions/gdata/module/MOD-INF/lib/jsr305-1.3.9.jar"/>
-<classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/commons-logging-1.1.1.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/fluent-hc-4.2.5.jar"/>
 <classpathentry exported="true" kind="lib" path="main/webapp/WEB-INF/lib/httpmime-4.2.5.jar"/>
 <classpathentry kind="lib" path="extensions/gdata/module/MOD-INF/lib/commons-logging-1.1.1.jar"/>
@@ -88,14 +80,44 @@
 <classpathentry kind="lib" path="main/webapp/WEB-INF/lib/swc-parser-lazy-3.1.5-jar-with-dependencies.jar" sourcepath="main/webapp/WEB-INF/lib-src/swc-parser-lazy-3.1.5-sources.jar"/>
 <classpathentry kind="lib" path="main/webapp/WEB-INF/lib/jackson-annotations-2.9.1.jar"/>
 <classpathentry kind="lib" path="main/webapp/WEB-INF/lib/jackson-core-2.9.1.jar"/>
 <classpathentry kind="lib" path="main/webapp/WEB-INF/lib/jackson-databind-2.9.1.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/json-20160810.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-beanutils-1.9.3.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-collections-3.2.2.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-digester-1.8.1.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-lang3-3.6.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-logging-1.2.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-text-1.1.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-validator-1.5.1.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/hamcrest-all-1.3.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/icu4j-4.2.1.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/joda-time-2.9.9.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/opencsv-4.0.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/org.everit.json.schema-1.5.1.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-fileupload-1.2.1.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-lang-2.5.jar"/>
 <classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/mysql-connector-java-5.1.44-bin.jar"/>
 <classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/json-simple-1.1.1.jar"/>
 <classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/jackson-mapper-asl-1.9.13.jar"/>
 <classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/postgresql-42.1.4.jar"/>
 <classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/mariadb-java-client-2.2.0.jar"/>
 <classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/jackson-core-asl-1.9.13.jar"/>
 <classpathentry kind="lib" path="extensions/database/module/MOD-INF/lib/jasypt-1.9.2.jar"/>
 <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/commons-csv-1.5.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/junit-4.12.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/bsh-2.0b4.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/byte-buddy-1.6.14.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/byte-buddy-agent-1.6.14.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/cglib-nodep-2.2.2.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/hamcrest-core-1.3.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/javassist-3.21.0-GA.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/jcommander-1.48.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/mockito-core-2.8.9.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/objenesis-2.5.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/powermock-mockito2-1.7.1-full.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/testng-6.9.10.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/datapackage-java-1.0-SNAPSHOT.jar"/>
+<classpathentry kind="lib" path="main/webapp/WEB-INF/lib/tableschema-java-1.0-SNAPSHOT.jar"/>
 <classpathentry kind="output" path="main/webapp/WEB-INF/classes"/>
 </classpath>

@@ -1,6 +1,6 @@
 <?xml version="1.0" encoding="UTF-8"?>
 <classpath>
-<classpathentry excluding="build/**|main/webapp/modules/core/MOD-INF/controller.js|main/webapp/modules/core/externals/|test-output/" kind="src" path=""/>
+<classpathentry kind="src" path=""/>
 <classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.JRE_CONTAINER"/>
 <classpathentry kind="output" path=""/>
 </classpath>

@@ -57,6 +57,7 @@ org.eclipse.wst.jsdt.core.compiler.problem.unusedParameterIncludeDocCommentRefer
 org.eclipse.wst.jsdt.core.compiler.problem.unusedParameterWhenImplementingAbstract=disabled
 org.eclipse.wst.jsdt.core.compiler.problem.unusedPrivateMember=warning
 org.eclipse.wst.jsdt.core.compiler.source=1.3
+org.eclipse.wst.jsdt.core.compiler.source.type=script
 org.eclipse.wst.jsdt.core.compiler.taskCaseSensitive=enabled
 org.eclipse.wst.jsdt.core.compiler.taskPriorities=NORMAL,HIGH,NORMAL
 org.eclipse.wst.jsdt.core.compiler.taskTags=TODO,FIXME,XXX
@@ -318,4 +319,5 @@ org.eclipse.wst.jsdt.core.formatter.tabulation.char=space
 org.eclipse.wst.jsdt.core.formatter.tabulation.size=4
 org.eclipse.wst.jsdt.core.formatter.use_tabs_only_for_leading_indentations=false
 org.eclipse.wst.jsdt.core.formatter.wrap_before_binary_operator=true
-semanticValidation=enabled
+semanticValidation=disabled
+strictOnKeywordUsage=disabled

@@ -0,0 +1,8 @@
DELEGATES_PREFERENCE=delegateValidatorList
USER_BUILD_PREFERENCE=enabledBuildValidatorList
USER_MANUAL_PREFERENCE=enabledManualValidatorList
USER_PREFERENCE=overrideGlobalPreferencestruedisableAllValidationtrueversion1.2.700.v201508251749
eclipse.preferences.version=1
override=true
suspend=true
vf.version=3

@@ -1,17 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="con" path="com.google.appengine.eclipse.core.GAE_CONTAINER"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.6"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/butterfly-trunk.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/butterfly-trunk.jar"/>
<classpathentry kind="lib" path="/grefine-server/lib/servlet-api-2.5.jar" sourcepath="/grefine-server/lib-src/servlet-api-2.5-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/json-20100208.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/json-20100208-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/httpclient-4.0.1.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/httpclient-4.0.1-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/httpcore-4.0.1.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/httpcore-4.0.1-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/slf4j-api-1.5.6.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/slf4j-api-1.5.6-sources.jar"/>
<classpathentry combineaccessrules="false" kind="src" path="/grefine-broker"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/rhino-1.7R2.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/velocity-1.5.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/commons-collections-3.2.1.jar"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

@@ -1,43 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine-appengine-broker</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.ui.externaltools.ExternalToolBuilder</name>
<triggers>full,incremental,</triggers>
<arguments>
<dictionary>
<key>LaunchConfigHandle</key>
<value>&lt;project&gt;/.externalToolBuilders/com.google.gdt.eclipse.core.webAppProjectValidator.launch</value>
</dictionary>
</arguments>
</buildCommand>
<buildCommand>
<name>com.google.appengine.eclipse.core.enhancerbuilder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.ui.externaltools.ExternalToolBuilder</name>
<triggers>full,incremental,</triggers>
<arguments>
<dictionary>
<key>LaunchConfigHandle</key>
<value>&lt;project&gt;/.externalToolBuilders/com.google.appengine.eclipse.core.projectValidator.launch</value>
</dictionary>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
<nature>com.google.appengine.eclipse.core.gaeNature</nature>
</natures>
</projectDescription>

@@ -1,23 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="src" path="tests/src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.6"/>
<classpathentry kind="lib" path="/grefine-server/lib/servlet-api-2.5.jar" sourcepath="/grefine-server/lib-src/servlet-api-2.5-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/butterfly-trunk.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/butterfly-trunk.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/json-20100208.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/json-20100208-sources.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/bdb-je-4.0.103.jar" sourcepath="module/MOD-INF/lib-src/bdb-je-4.0.103-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/httpclient-4.0.1.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/httpclient-4.0.1-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/httpcore-4.0.1.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/httpcore-4.0.1-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/slf4j-api-1.5.6.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/slf4j-api-1.5.6-sources.jar"/>
<classpathentry kind="lib" path="/grefine/tests/server/lib/mockito-all-1.8.4.jar" sourcepath="/grefine/tests/server/lib-src/mockito-all-1.8.4-sources.jar"/>
<classpathentry kind="lib" path="/grefine/tests/server/lib/testng-5.12.1.jar" sourcepath="/grefine/tests/server/lib-src/testng-5.12.1-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/log4j-1.2.15.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/log4j-1.2.15-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/slf4j-log4j12-1.5.6.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/slf4j-log4j12-1.5.6-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/rhino-1.7R2.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/lessen-trunk-r8.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/commons-collections-3.2.1.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/jcl-over-slf4j-1.5.6.jar" sourcepath="/grefine/webapp/WEB-INF/lib-src/jcl-over-slf4j-1.5.6-sources.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/velocity-1.5.jar"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

@@ -1,17 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine-broker</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
</natures>
</projectDescription>

@@ -140,6 +140,10 @@
 <classpath refid="webapp.class.path" />
 </javac>
 <copy file="${webapp.src.dir}/log4j.properties" tofile="${webapp.classes.dir}/log4j.properties"/>
+<copy file="${main.dir}/resources/schemas/datapackage-template.json" tofile="${webapp.classes.dir}/schemas/datapackage-template.json"/>
+<copy file="${main.dir}/resources/schemas/TableSchemaValidator.json" tofile="${webapp.classes.dir}/schemas/TableSchemaValidator.json"/>
+<copy file="${webapp.src.dir}/validator-resource-bundle.properties" tofile="${webapp.classes.dir}/validator-resource-bundle.properties"/>
+<copy file="${webapp.src.dir}/log4j.properties" tofile="${webapp.classes.dir}/log4j.properties"/>
 </target>
 <target name="build_tests" depends="build">

@@ -46,7 +46,6 @@ import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import com.google.refine.ProjectManager;
-import com.google.refine.ProjectMetadata;
 import com.google.refine.RefineServlet;
 import com.google.refine.commands.HttpUtilities;
 import com.google.refine.extension.database.model.DatabaseColumn;
@@ -56,6 +55,7 @@ import com.google.refine.importing.ImportingController;
 import com.google.refine.importing.ImportingJob;
 import com.google.refine.importing.ImportingManager;
 import com.google.refine.model.Project;
+import com.google.refine.model.medadata.ProjectMetadata;
 import com.google.refine.util.JSONUtilities;
 import com.google.refine.util.ParsingUtilities;

@@ -25,7 +25,7 @@ import org.testng.annotations.Parameters;
 import org.testng.annotations.Test;
 import com.google.refine.ProjectManager;
-import com.google.refine.ProjectMetadata;
+import com.google.refine.model.medadata.ProjectMetadata;
 import com.google.refine.RefineServlet;
 import com.google.refine.extension.database.mysql.MySQLDatabaseService;
 import com.google.refine.extension.database.stub.RefineDbServletStub;

@@ -25,7 +25,7 @@ import org.testng.annotations.Parameters;
 import org.testng.annotations.Test;
 import com.google.refine.ProjectManager;
-import com.google.refine.ProjectMetadata;
+import com.google.refine.model.medadata.ProjectMetadata;
 import com.google.refine.RefineServlet;
 import com.google.refine.extension.database.DBExtensionTestUtils;
 import com.google.refine.extension.database.DBExtensionTests;

@@ -1,32 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7">
<attributes>
<attribute name="owner.project.facets" value="java"/>
</attributes>
</classpathentry>
<classpathentry combineaccessrules="false" kind="src" path="/grefine"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-core-1.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-spreadsheet-3.0.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/butterfly-1.0.1.jar"/>
<classpathentry kind="lib" path="/grefine/webapp/WEB-INF/lib/jackson-core-asl-1.9.12.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-base-1.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-client-1.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-client-meta-1.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-docs-3.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-docs-meta-3.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-media-1.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/gdata-spreadsheet-meta-3.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/jsr305-1.3.9.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/mail.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-api-client-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-api-client-servlet-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-api-services-drive-v2-rev168-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-api-services-fusiontables-v2-rev3-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-http-client-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-http-client-jackson2-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-oauth-client-1.20.0.jar"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/google-oauth-client-servlet-1.20.0.jar"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

@@ -1,31 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine-gdata-extension</name>
<comment></comment>
<projects>
<project>gridworks</project>
<project>gridworks-server</project>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.wst.jsdt.core.javascriptValidator</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.wst.common.project.facet.core.builder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
<nature>org.eclipse.wst.common.project.facet.core.nature</nature>
<nature>org.eclipse.wst.jsdt.core.jsNature</nature>
</natures>
</projectDescription>

@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path=""/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.JRE_CONTAINER"/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.WebProject">
<attributes>
<attribute name="hide" value="true"/>
</attributes>
</classpathentry>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.baseBrowserLibrary"/>
<classpathentry kind="output" path=""/>
</classpath>

@@ -1 +0,0 @@
/classes/

@@ -39,11 +39,11 @@ import com.google.api.services.fusiontables.model.Column;
 import com.google.api.services.fusiontables.model.Sqlresponse;
 import com.google.api.services.fusiontables.model.Table;
-import com.google.refine.ProjectMetadata;
 import com.google.refine.importers.TabularImportingParserBase;
 import com.google.refine.importers.TabularImportingParserBase.TableDataReader;
 import com.google.refine.importing.ImportingJob;
 import com.google.refine.model.Project;
+import com.google.refine.model.medadata.ProjectMetadata;
 import com.google.refine.util.JSONUtilities;
 /**

@@ -45,11 +45,11 @@ import com.google.gdata.data.spreadsheet.SpreadsheetEntry;
 import com.google.gdata.data.spreadsheet.WorksheetEntry;
 import com.google.gdata.util.ServiceException;
-import com.google.refine.ProjectMetadata;
 import com.google.refine.importers.TabularImportingParserBase;
 import com.google.refine.importers.TabularImportingParserBase.TableDataReader;
 import com.google.refine.importing.ImportingJob;
 import com.google.refine.model.Project;
+import com.google.refine.model.medadata.ProjectMetadata;
 import com.google.refine.util.JSONUtilities;
 /**

@@ -65,7 +65,6 @@ import com.google.gdata.util.AuthenticationException;
 import com.google.gdata.util.ServiceException;
 import com.google.refine.ProjectManager;
-import com.google.refine.ProjectMetadata;
 import com.google.refine.RefineServlet;
 import com.google.refine.commands.HttpUtilities;
 import com.google.refine.importing.DefaultImportingController;
@@ -73,6 +72,7 @@ import com.google.refine.importing.ImportingController;
 import com.google.refine.importing.ImportingJob;
 import com.google.refine.importing.ImportingManager;
 import com.google.refine.model.Project;
+import com.google.refine.model.medadata.ProjectMetadata;
 import com.google.refine.util.JSONUtilities;
 import com.google.refine.util.ParsingUtilities;

@@ -1,8 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.6"/>
<classpathentry combineaccessrules="false" kind="src" path="/grefine"/>
<classpathentry kind="lib" path="module/MOD-INF/lib/jython-standalone-2.7.1.jar"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

@@ -1,17 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine-jython</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
</natures>
</projectDescription>

@@ -1,7 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
<classpathentry combineaccessrules="false" kind="src" path="/OpenRefine"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

@@ -1,17 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>refine-pd-extension</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
</natures>
</projectDescription>

@@ -1 +0,0 @@
/classes/

@@ -41,10 +41,10 @@ import org.json.JSONObject;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
-import com.google.refine.ProjectMetadata;
 import com.google.refine.importers.TabularImportingParserBase;
 import com.google.refine.importing.ImportingJob;
 import com.google.refine.model.Project;
+import com.google.refine.model.medadata.ProjectMetadata;
 import com.google.refine.util.JSONUtilities;
 public class PCAxisImporter extends TabularImportingParserBase {

@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.6"/>
<classpathentry kind="output" path="module/MOD-INF/classes"/>
</classpath>

@@ -1,29 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine-sample-extension</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.wst.jsdt.core.javascriptValidator</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.wst.common.project.facet.core.builder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
<nature>org.eclipse.wst.common.project.facet.core.nature</nature>
<nature>org.eclipse.wst.jsdt.core.jsNature</nature>
</natures>
</projectDescription>

@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path=""/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.JRE_CONTAINER"/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.WebProject">
<attributes>
<attribute name="hide" value="true"/>
</attributes>
</classpathentry>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.baseBrowserLibrary"/>
<classpathentry kind="output" path=""/>
</classpath>

@@ -1 +0,0 @@
/classes/

@@ -1,46 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path="src"/>
<classpathentry kind="src" path="tests/server/src"/>
<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/butterfly-1.0.1.jar" sourcepath="webapp/WEB-INF/lib-src/butterfly-1.0.1-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/commons-codec-1.6.jar" sourcepath="webapp/WEB-INF/lib-src/commons-codec-1.6-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/commons-lang-2.5.jar" sourcepath="webapp/WEB-INF/lib-src/commons-lang-2.5-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/commons-fileupload-1.2.1.jar" sourcepath="webapp/WEB-INF/lib-src/commons-fileupload-1.2.1-sources.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/json-20100208.jar" sourcepath="webapp/WEB-INF/lib-src/json-20100208-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/icu4j-4.2.1.jar" sourcepath="webapp/WEB-INF/lib-src/icu4j-4.2.1-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/arithcode-1.1.jar" sourcepath="webapp/WEB-INF/lib-src/arithcode-1.1-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/secondstring-20100303.jar" sourcepath="webapp/WEB-INF/lib-src/secondstring-20100303-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/ant-tools-1.8.0.jar" sourcepath="webapp/WEB-INF/lib-src/ant-tools-1.8.0-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/vicino-1.1.jar" sourcepath="webapp/WEB-INF/lib-src/vicino-1.1-sources.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/opencsv-2.4-SNAPSHOT.jar" sourcepath="tests/java/lib-src/opencsv-2.4-SNAPSHOT-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/jcl-over-slf4j-1.5.6.jar" sourcepath="webapp/WEB-INF/lib-src/jcl-over-slf4j-1.5.6-sources.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/slf4j-api-1.5.6.jar" sourcepath="webapp/WEB-INF/lib/slf4j-api-1.5.6.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/slf4j-log4j12-1.5.6.jar" sourcepath="webapp/WEB-INF/lib-src/slf4j-log4j12-1.5.6-sources.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/log4j-1.2.15.jar" sourcepath="webapp/WEB-INF/lib-src/log4j-1.2.15-sources.jar"/>
<classpathentry exported="true" kind="lib" path="webapp/WEB-INF/lib/dom4j-1.6.1.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/xmlbeans-2.3.0.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/clojure-1.5.1-slim.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/jackson-core-asl-1.9.12.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/marc4j-2.4.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/jrdf-0.5.6.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/commons-collections-3.2.1.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/commons-io-1.4.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/rhino-1.7R2.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/velocity-1.5.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/lessen-trunk-r8.jar"/>
<classpathentry kind="lib" path="tests/server/lib/mockito-all-1.9.5.jar" sourcepath="tests/server/lib-src/mockito-all-1.9.5-sources.jar"/>
<classpathentry kind="lib" path="tests/server/lib/testng-6.8.jar" sourcepath="tests/server/lib-src/testng-6.8-sources.jar"/>
<classpathentry exported="true" kind="lib" path="/grefine-server/lib/servlet-api-2.5.jar" sourcepath="/grefine-server/lib-src/servlet-api-2.5-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/httpclient-4.2.5.jar" sourcepath="webapp/WEB-INF/lib-src/httpclient-4.2.5-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/httpcore-4.2.4.jar" sourcepath="webapp/WEB-INF/lib-src/httpcore-4.2.4-sources.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/signpost-commonshttp4-1.2.1.2.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/signpost-core-1.2.1.2.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/jsoup-1.4.1.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/odfdom-java-0.8.7.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/guava-13.0.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/poi-3.13-20150929.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/poi-ooxml-3.13-20150929.jar"/>
<classpathentry kind="lib" path="webapp/WEB-INF/lib/poi-ooxml-schemas-3.13-20150929.jar"/>
<classpathentry kind="output" path="webapp/WEB-INF/classes"/>
</classpath>

main/.gitignore
@@ -1 +0,0 @@
/test-output

@@ -1,29 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
<name>grefine</name>
<comment></comment>
<projects>
</projects>
<buildSpec>
<buildCommand>
<name>org.eclipse.wst.jsdt.core.javascriptValidator</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.wst.common.project.facet.core.builder</name>
<arguments>
</arguments>
</buildCommand>
<buildCommand>
<name>org.eclipse.jdt.core.javabuilder</name>
<arguments>
</arguments>
</buildCommand>
</buildSpec>
<natures>
<nature>org.eclipse.jdt.core.javanature</nature>
<nature>org.eclipse.wst.common.project.facet.core.nature</nature>
<nature>org.eclipse.wst.jsdt.core.jsNature</nature>
</natures>
</projectDescription>

@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
<classpathentry kind="src" path=""/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.JRE_CONTAINER"/>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.WebProject">
<attributes>
<attribute name="hide" value="true"/>
</attributes>
</classpathentry>
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.baseBrowserLibrary"/>
<classpathentry kind="output" path=""/>
</classpath>

@@ -0,0 +1,213 @@
{
"version": "1.0.0",
"errors": {
"io-error": {
"name": "IO Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source returned an IO Error of type {error_type}",
"description": "Data reading error because of IO error.\n\n How it could be resolved:\n - Fix path if it's not correct."
},
"http-error": {
"name": "HTTP Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source returned an HTTP error with a status code of {status_code}",
"description": "Data reading error because of HTTP error.\n\n How it could be resolved:\n - Fix url link if it's not correct."
},
"source-error": {
"name": "Source Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source has not supported or has inconsistent contents; no tabular data can be extracted",
"description": "Data reading error because of not supported or inconsistent contents.\n\n How it could be resolved:\n - Fix data contents (e.g. change JSON data to array or arrays/objects).\n - Set correct source settings in {validator}."
},
"scheme-error": {
"name": "Scheme Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source is in an unknown scheme; no tabular data can be extracted",
"description": "Data reading error because of incorrect scheme.\n\n How it could be resolved:\n - Fix data scheme (e.g. change scheme from `ftp` to `http`).\n - Set correct scheme in {validator}."
},
"format-error": {
"name": "Format Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source is in an unknown format; no tabular data can be extracted",
"description": "Data reading error because of incorrect format.\n\n How it could be resolved:\n - Fix data format (e.g. change file extension from `txt` to `csv`).\n - Set correct format in {validator}."
},
"encoding-error": {
"name": "Encoding Error",
"type": "source",
"context": "table",
"weight": 100,
"message": "The data source could not be successfully decoded with {encoding} encoding",
"description": "Data reading error because of an encoding problem.\n\n How it could be resolved:\n - Fix data source if it's broken.\n - Set correct encoding in {validator}."
},
"blank-header": {
"name": "Blank Header",
"type": "structure",
"context": "head",
"weight": 3,
"message": "Header in column {column_number} is blank",
"description": "A column in the header row is missing a value. Column names should be provided.\n\n How it could be resolved:\n - Add the missing column name to the first row of the data source.\n - If the first row starts with, or ends with a comma, remove it.\n - If this error should be ignored disable `blank-header` check in {validator}."
},
"duplicate-header": {
"name": "Duplicate Header",
"type": "structure",
"context": "head",
"weight": 3,
"message": "Header in column {column_number} is duplicated to header in column(s) {column_numbers}",
"description": "Two columns in the header row have the same value. Column names should be unique.\n\n How it could be resolved:\n - Add the missing column name to the first row of the data.\n - If the first row starts with, or ends with a comma, remove it.\n - If this error should be ignored disable `duplicate-header` check in {validator}."
},
"blank-row": {
"name": "Blank Row",
"type": "structure",
"context": "body",
"weight": 9,
"message": "Row {row_number} is completely blank",
"description": "This row is empty. A row should contain at least one value.\n\n How it could be resolved:\n - Delete the row.\n - If this error should be ignored disable `blank-row` check in {validator}."
},
"duplicate-row": {
"name": "Duplicate Row",
"type": "structure",
"context": "body",
"weight": 5,
"message": "Row {row_number} is duplicated to row(s) {row_numbers}",
"description": "The exact same data has been seen in another row.\n\n How it could be resolved:\n - If some of the data is incorrect, correct it.\n - If the whole row is an incorrect duplicate, remove it.\n - If this error should be ignored disable `duplicate-row` check in {validator}."
},
"extra-value": {
"name": "Extra Value",
"type": "structure",
"context": "body",
"weight": 9,
"message": "Row {row_number} has an extra value in column {column_number}",
"description": "This row has more values compared to the header row (the first row in the data source). A key concept is that all the rows in tabular data must have the same number of columns.\n\n How it could be resolved:\n - Check data has an extra comma between the values in this row.\n - If this error should be ignored disable `extra-value` check in {validator}."
},
"missing-value": {
"name": "Missing Value",
"type": "structure",
"context": "body",
"weight": 9,
"message": "Row {row_number} has a missing value in column {column_number}",
"description": "This row has less values compared to the header row (the first row in the data source). A key concept is that all the rows in tabular data must have the same number of columns.\n\n How it could be resolved:\n - Check data is not missing a comma between the values in this row.\n - If this error should be ignored disable `missing-value` check in {validator}."
},
"schema-error": {
"name": "Table Schema Error",
"type": "schema",
"context": "table",
"weight": 15,
"message": "Table Schema error: {error_message}",
"description": "Provided schema is not valid.\n\n How it could be resolved:\n - Update schema descriptor to be a valid descriptor\n - If this error should be ignored disable schema checks in {validator}."
},
"non-matching-header": {
"name": "Non-Matching Header",
"type": "schema",
"context": "head",
"weight": 9,
"message": "Header in column {column_number} doesn't match field name {field_name} in the schema",
"description": "One of the data source headers doesn't match the field name defined in the schema.\n\n How it could be resolved:\n - Rename header in the data source or field in the schema\n - If this error should be ignored disable `non-matching-header` check in {validator}."
},
"extra-header": {
"name": "Extra Header",
"type": "schema",
"context": "head",
"weight": 9,
"message": "There is an extra header in column {column_number}",
"description": "The first row of the data source contains header that doesn't exist in the schema.\n\n How it could be resolved:\n - Remove the extra column from the data source or add the missing field to the schema\n - If this error should be ignored disable `extra-header` check in {validator}."
},
"missing-header": {
"name": "Missing Header",
"type": "schema",
"context": "head",
"weight": 9,
"message": "There is a missing header in column {column_number}",
"description": "Based on the schema there should be a header that is missing in the first row of the data source.\n\n How it could be resolved:\n - Add the missing column to the data source or remove the extra field from the schema\n - If this error should be ignored disable `missing-header` check in {validator}."
},
"type-or-format-error": {
"name": "Type or Format Error",
"type": "schema",
"context": "body",
"weight": 9,
"message": "The value {value} in row {row_number} and column {column_number} is not type {field_type} and format {field_format}",
"description": "The value does not match the schema type and format for this field.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If this value is correct, adjust the type and/or format.\n - To ignore the error, disable the `type-or-format-error` check in {validator}. In this case all schema checks for row values will be ignored."
},
"required-constraint": {
"name": "Required Constraint",
"type": "schema",
"context": "body",
"weight": 9,
"message": "Column {column_number} is a required field, but row {row_number} has no value",
"description": "This field is a required field, but it contains no value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove the `required` constraint from the schema.\n - If this error should be ignored disable `required-constraint` check in {validator}."
},
"pattern-constraint": {
"name": "Pattern Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the pattern constraint of {constraint}",
"description": "This field value should conform to constraint pattern.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `pattern` constraint in the schema.\n - If this error should be ignored disable `pattern-constraint` check in {validator}."
},
"unique-constraint": {
"name": "Unique Constraint",
"type": "schema",
"context": "body",
"weight": 9,
"message": "Rows {row_numbers} has unique constraint violation in column {column_number}",
"description": "This field is a unique field but it contains a value that has been used in another row.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then the values in this column are not unique. Remove the `unique` constraint from the schema.\n - If this error should be ignored disable `unique-constraint` check in {validator}."
},
"enumerable-constraint": {
"name": "Enumerable Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the given enumeration: {constraint}",
"description": "This field value should be equal to one of the values in the enumeration constraint.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `enum` constraint in the schema.\n - If this error should be ignored disable `enumerable-constraint` check in {validator}."
},
"minimum-constraint": {
"name": "Minimum Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the minimum constraint of {constraint}",
"description": "This field value should be greater or equal than constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `minimum` constraint in the schema.\n - If this error should be ignored disable `minimum-constraint` check in {validator}."
},
"maximum-constraint": {
"name": "Maximum Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the maximum constraint of {constraint}",
"description": "This field value should be less or equal than constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `maximum` constraint in the schema.\n - If this error should be ignored disable `maximum-constraint` check in {validator}."
},
"minimum-length-constraint": {
"name": "Minimum Length Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the minimum length constraint of {constraint}",
"description": "A lenght of this field value should be greater or equal than schema constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `minimumLength` constraint in the schema.\n - If this error should be ignored disable `minimum-length-constraint` check in {validator}."
},
"maximum-length-constraint": {
"name": "Maximum Length Constraint",
"type": "schema",
"context": "body",
"weight": 7,
"message": "The value {value} in row {row_number} and column {column_number} does not conform to the maximum length constraint of {constraint}",
"description": "A lenght of this field value should be less or equal than schema constraint value.\n\n How it could be resolved:\n - If this value is not correct, update the value.\n - If value is correct, then remove or refine the `maximumLength` constraint in the schema.\n - If this error should be ignored disable `maximum-length-constraint` check in {validator}."
}
}
}

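The catalog entries above use `{placeholder}` tokens (`{column_number}`, `{row_number}`, `{constraint}`, `{validator}`, ...) that a validator substitutes when it reports an error. A minimal sketch of that substitution, assuming the context values arrive as a plain `Map<String, String>`; the `MessageTemplates`/`render` names are illustrative and not part of this patch:

```java
import java.util.Collections;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MessageTemplates {

    private static final Pattern PLACEHOLDER = Pattern.compile("\\{([a-z_]+)\\}");

    /** Replaces {tokens} in a catalog message with values from the context map. */
    static String render(String template, Map<String, String> context) {
        Matcher m = PLACEHOLDER.matcher(template);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            // Leave the token in place when no value is supplied for it.
            String value = context.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(sb, Matcher.quoteReplacement(value));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        String template = "There is a missing header in column {column_number}";
        System.out.println(render(template, Collections.singletonMap("column_number", "3")));
        // prints: There is a missing header in column 3
    }
}
```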
View File

@ -0,0 +1,16 @@
{
"image": "",
"license": "",
"last_updated": "",
"keywords": [],
"sources": [{
"web": "",
"name": "",
"title": ""
}],
"name": "",
"description": "",
"resources": [],
"title": "",
"version": ""
}

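The template above ships with empty strings and arrays that get filled in before export. A minimal sketch of loading it and populating a few fields with org.json and commons-io (both already used in this codebase); the classpath location and field values are illustrative:

```java
import java.io.InputStream;

import org.apache.commons.io.IOUtils;
import org.json.JSONObject;

public class TemplateExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical classpath location of the empty data package template shown above.
        try (InputStream in = TemplateExample.class
                .getResourceAsStream("/datapackage-template.json")) {
            JSONObject descriptor = new JSONObject(IOUtils.toString(in, "UTF-8"));
            descriptor.put("name", "my-dataset");
            descriptor.put("title", "My dataset");
            descriptor.getJSONArray("keywords").put("openrefine");
            System.out.println(descriptor.toString(2));
        }
    }
}
```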
View File

@ -37,6 +37,7 @@ import java.io.IOException;
import java.io.InputStream; import java.io.InputStream;
import java.time.LocalDateTime; import java.time.LocalDateTime;
import java.time.ZoneId; import java.time.ZoneId;
import java.time.temporal.ChronoUnit;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.Collections; import java.util.Collections;
import java.util.Comparator; import java.util.Comparator;
@ -45,7 +46,7 @@ import java.util.List;
import java.util.Map; import java.util.Map;
import java.util.Map.Entry; import java.util.Map.Entry;
import org.apache.commons.lang.exception.ExceptionUtils; import org.apache.commons.lang3.exception.ExceptionUtils;
import org.apache.tools.tar.TarOutputStream; import org.apache.tools.tar.TarOutputStream;
import org.json.JSONArray; import org.json.JSONArray;
import org.json.JSONException; import org.json.JSONException;
@ -55,6 +56,8 @@ import org.slf4j.LoggerFactory;
import com.google.refine.history.HistoryEntryManager; import com.google.refine.history.HistoryEntryManager;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.preference.PreferenceStore; import com.google.refine.preference.PreferenceStore;
import com.google.refine.preference.TopList; import com.google.refine.preference.TopList;
@ -73,7 +76,6 @@ public abstract class ProjectManager {
// Don't spend more than this much time saving projects if doing a quick save // Don't spend more than this much time saving projects if doing a quick save
static protected final int QUICK_SAVE_MAX_TIME = 1000 * 30; // 30 secs static protected final int QUICK_SAVE_MAX_TIME = 1000 * 30; // 30 secs
protected Map<Long, ProjectMetadata> _projectsMetadata; protected Map<Long, ProjectMetadata> _projectsMetadata;
protected Map<String, Integer> _projectsTags;// TagName, number of projects having that tag protected Map<String, Integer> _projectsTags;// TagName, number of projects having that tag
protected PreferenceStore _preferenceStore; protected PreferenceStore _preferenceStore;
@ -99,8 +101,8 @@ public abstract class ProjectManager {
transient protected Map<Long, Project> _projects; transient protected Map<Long, Project> _projects;
static public ProjectManager singleton; static public ProjectManager singleton;
protected ProjectManager(){ protected ProjectManager() {
_projectsMetadata = new HashMap<Long, ProjectMetadata>(); _projectsMetadata = new HashMap<Long, ProjectMetadata>();
_preferenceStore = new PreferenceStore(); _preferenceStore = new PreferenceStore();
_projects = new HashMap<Long, Project>(); _projects = new HashMap<Long, Project>();
@ -191,7 +193,7 @@ public abstract class ProjectManager {
} catch (Exception e) { } catch (Exception e) {
e.printStackTrace(); e.printStackTrace();
} }
}//FIXME what should be the behaviour if metadata is null? i.e. not found }
Project project = getProject(id); Project project = getProject(id);
if (project != null && metadata != null && metadata.getModified().isAfter(project.getLastSave())) { if (project != null && metadata != null && metadata.getModified().isAfter(project.getLastSave())) {
@ -200,8 +202,7 @@ public abstract class ProjectManager {
} catch (Exception e) { } catch (Exception e) {
e.printStackTrace(); e.printStackTrace();
} }
}//FIXME what should be the behaviour if project is null? i.e. not found or loaded. }
//FIXME what should happen if the metadata is found, but not the project? or vice versa?
} }
} }
@ -212,7 +213,7 @@ public abstract class ProjectManager {
* @param projectId * @param projectId
* @throws Exception * @throws Exception
*/ */
public abstract void saveMetadata(ProjectMetadata metadata, long projectId) throws Exception; public abstract void saveMetadata(IMetadata metadata, long projectId) throws Exception;
/** /**
* Save project to the data store * Save project to the data store
@ -265,23 +266,23 @@ public abstract class ProjectManager {
Project project = _projects.get(id); // don't call getProject() as that will load the project. Project project = _projects.get(id); // don't call getProject() as that will load the project.
if (project != null) { if (project != null) {
LocalDateTime projectLastSaveTime = project.getLastSave();
boolean hasUnsavedChanges = boolean hasUnsavedChanges =
metadata.getModified().atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() >= project.getLastSave().atZone(ZoneId.systemDefault()).toInstant().toEpochMilli(); !metadata.getModified().isBefore(projectLastSaveTime);
// We use >= instead of just > to avoid the case where a newly created project // We use >= instead of just > to avoid the case where a newly created project
// has the same modified and last save times, resulting in the project not getting // has the same modified and last save times, resulting in the project not getting
// saved at all. // saved at all.
if (hasUnsavedChanges) { if (hasUnsavedChanges) {
long msecsOverdue = startTimeOfSave.atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() - project.getLastSave().atZone(ZoneId.systemDefault()).toInstant().toEpochMilli(); long msecsOverdue = ChronoUnit.MILLIS.between(projectLastSaveTime, startTimeOfSave);
records.add(new SaveRecord(project, msecsOverdue)); records.add(new SaveRecord(project, msecsOverdue));
} else if (!project.getProcessManager().hasPending() } else if (!project.getProcessManager().hasPending()
&& startTimeOfSave.atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() - project.getLastSave().atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() > PROJECT_FLUSH_DELAY) { && ChronoUnit.MILLIS.between(projectLastSaveTime, startTimeOfSave) > PROJECT_FLUSH_DELAY) {
/* /*
* It's been a while since the project was last saved and it hasn't been * It's been a while since the project was last saved and it hasn't been
* modified. We can safely remove it from the cache to save some memory. * modified. We can safely remove it from the cache to save some memory.
*/ */
_projects.remove(id).dispose(); _projects.remove(id).dispose();
} }
@ -307,13 +308,10 @@ public abstract class ProjectManager {
"Saving all modified projects ..." : "Saving all modified projects ..." :
"Saving some modified projects ..." "Saving some modified projects ..."
); );
for (int i = 0; for (int i = 0;i < records.size() &&
i < records.size() && (allModified || (ChronoUnit.MILLIS.between(startTimeOfSave, LocalDateTime.now()) < QUICK_SAVE_MAX_TIME));
(allModified || (LocalDateTime.now().atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() -
startTimeOfSave.atZone(ZoneId.systemDefault()).toInstant().toEpochMilli() < QUICK_SAVE_MAX_TIME));
i++) { i++) {
try { try {
saveProject(records.get(i).project); saveProject(records.get(i).project);
} catch (Exception e) { } catch (Exception e) {
@ -351,14 +349,14 @@ public abstract class ProjectManager {
/** /**
* Gets the project metadata from memory * Gets the project metadata from memory
* Requires that the metadata has already been loaded from the data store * Requires that the metadata has already been loaded from the data store.
* @param id * @param id
* @return * @return
*/ */
public ProjectMetadata getProjectMetadata(long id) { public ProjectMetadata getProjectMetadata(long id) {
return _projectsMetadata.get(id); return _projectsMetadata.get(id);
} }
/** /**
* Gets the project metadata from memory * Gets the project metadata from memory
* Requires that the metadata has already been loaded from the data store * Requires that the metadata has already been loaded from the data store
@ -368,7 +366,7 @@ public abstract class ProjectManager {
public ProjectMetadata getProjectMetadata(String name) { public ProjectMetadata getProjectMetadata(String name) {
for (ProjectMetadata pm : _projectsMetadata.values()) { for (ProjectMetadata pm : _projectsMetadata.values()) {
if (pm.getName().equals(name)) { if (pm.getName().equals(name)) {
return pm; return pm;
} }
} }
return null; return null;
@ -420,7 +418,7 @@ public abstract class ProjectManager {
userMetadataPreference = new JSONArray(userMeta); userMetadataPreference = new JSONArray(userMeta);
} catch (JSONException e1) { } catch (JSONException e1) {
logger.warn("wrong definition of userMetadata format. Please use form [{\"name\": \"client name\", \"display\":true}, {\"name\": \"progress\", \"display\":false}]"); logger.warn("wrong definition of userMetadata format. Please use form [{\"name\": \"client name\", \"display\":true}, {\"name\": \"progress\", \"display\":false}]");
logger.error(ExceptionUtils.getFullStackTrace(e1)); logger.error(ExceptionUtils.getStackTrace(e1));
} }
for (int index = 0; index < userMetadataPreference.length(); index++) { for (int index = 0; index < userMetadataPreference.length(); index++) {
@ -465,7 +463,7 @@ public abstract class ProjectManager {
JSONObject projectMetaJsonObj = jsonObjArray.getJSONObject(index); JSONObject projectMetaJsonObj = jsonObjArray.getJSONObject(index);
projectMetaJsonObj.put("display", false); projectMetaJsonObj.put("display", false);
} catch (JSONException e) { } catch (JSONException e) {
logger.error(ExceptionUtils.getFullStackTrace(e)); logger.error(ExceptionUtils.getStackTrace(e));
} }
} }
} }
@ -474,6 +472,7 @@ public abstract class ProjectManager {
* Gets all the project Metadata currently held in memory. * Gets all the project Metadata currently held in memory.
* @return * @return
*/ */
public Map<Long, ProjectMetadata> getAllProjectMetadata() { public Map<Long, ProjectMetadata> getAllProjectMetadata() {
for(Project project : _projects.values()) { for(Project project : _projects.values()) {
mergeEmptyUserMetadata(project.getMetadata()); mergeEmptyUserMetadata(project.getMetadata());
@ -484,13 +483,14 @@ public abstract class ProjectManager {
/** /**
* Gets all the project tags currently held in memory * Gets all the project tags currently held in memory
* *
* @return * @return
*/ */
public Map<String, Integer> getAllProjectTags() { public Map<String, Integer> getAllProjectTags() {
return _projectsTags; return _projectsTags;
} }
/** /**
* Gets the required project from the data store * Gets the required project from the data store
* If project does not already exist in memory, it is loaded from the data store * If project does not already exist in memory, it is loaded from the data store
@ -596,8 +596,9 @@ public abstract class ProjectManager {
* *
* @param ps * @param ps
*/ */
static protected void preparePreferenceStore(PreferenceStore ps) { public static void preparePreferenceStore(PreferenceStore ps) {
ps.put("scripting.expressions", new TopList(s_expressionHistoryMax)); ps.put("scripting.expressions", new TopList(s_expressionHistoryMax));
ps.put("scripting.starred-expressions", new TopList(Integer.MAX_VALUE)); ps.put("scripting.starred-expressions", new TopList(Integer.MAX_VALUE));
} }
} }

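The hunk above replaces hand-rolled epoch-millisecond arithmetic with `java.time` comparisons: `!modified.isBefore(lastSave)` expresses the old `>=` test (keeping the "newly created project with identical timestamps" case saved), and `ChronoUnit.MILLIS.between(lastSave, now)` yields the elapsed milliseconds directly. A small stand-alone illustration of the equivalence:

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.temporal.ChronoUnit;

public class TimeComparisonDemo {
    public static void main(String[] args) {
        LocalDateTime lastSave = LocalDateTime.now();
        LocalDateTime modified = lastSave; // equal timestamps: a freshly created project

        long modifiedMillis = modified.atZone(ZoneId.systemDefault()).toInstant().toEpochMilli();
        long lastSaveMillis = lastSave.atZone(ZoneId.systemDefault()).toInstant().toEpochMilli();

        // Old style: ">=" on epoch millis
        boolean oldCheck = modifiedMillis >= lastSaveMillis;
        // New style: negated isBefore() gives the same ">=" semantics
        boolean newCheck = !modified.isBefore(lastSave);
        System.out.println(oldCheck + " == " + newCheck); // true == true

        // Elapsed milliseconds without converting through ZoneId/Instant by hand
        LocalDateTime startTimeOfSave = lastSave.plusSeconds(5);
        System.out.println(ChronoUnit.MILLIS.between(lastSave, startTimeOfSave)); // 5000
    }
}
```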
View File

@ -37,7 +37,7 @@ import java.util.Iterator;
import java.util.TreeSet; import java.util.TreeSet;
import java.util.regex.Pattern; import java.util.regex.Pattern;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
public class FingerprintKeyer extends Keyer { public class FingerprintKeyer extends Keyer {

View File

@ -52,11 +52,11 @@ import org.slf4j.LoggerFactory;
import com.google.refine.Jsonizable; import com.google.refine.Jsonizable;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.RefineServlet; import com.google.refine.RefineServlet;
import com.google.refine.browsing.Engine; import com.google.refine.browsing.Engine;
import com.google.refine.history.HistoryEntry; import com.google.refine.history.HistoryEntry;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.process.Process; import com.google.refine.process.Process;
import com.google.refine.util.ParsingUtilities; import com.google.refine.util.ParsingUtilities;
@ -194,7 +194,7 @@ public abstract class Command {
* @return * @return
* @throws ServletException * @throws ServletException
*/ */
protected ProjectMetadata getProjectMetadata(HttpServletRequest request) throws ServletException { protected ProjectMetadata getMetadata(HttpServletRequest request) throws ServletException {
if (request == null) { if (request == null) {
throw new IllegalArgumentException("parameter 'request' should not be null"); throw new IllegalArgumentException("parameter 'request' should not be null");
} }
@ -312,7 +312,20 @@ public abstract class Command {
w.flush(); w.flush();
w.close(); w.close();
} }
static protected void respondJSONObject(
HttpServletResponse response, JSONObject o)
throws IOException, JSONException {
response.setCharacterEncoding("UTF-8");
response.setHeader("Content-Type", "application/json");
response.setHeader("Cache-Control", "no-cache");
Writer w = response.getWriter();
w.append(o.toString());
w.flush();
w.close();
}
static protected void respondException(HttpServletResponse response, Exception e) static protected void respondException(HttpServletResponse response, Exception e)
throws IOException, ServletException { throws IOException, ServletException {

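`respondJSONObject` mirrors the existing respond helpers but writes a pre-built `org.json.JSONObject` directly. A minimal usage sketch inside a hypothetical command subclass (the `PingCommand` name and payload fields are illustrative):

```java
import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.json.JSONException;
import org.json.JSONObject;

import com.google.refine.commands.Command;

public class PingCommand extends Command {
    @Override
    public void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        try {
            JSONObject payload = new JSONObject();
            payload.put("code", "ok");
            payload.put("timestamp", System.currentTimeMillis());
            respondJSONObject(response, payload); // helper added in the hunk above
        } catch (JSONException e) {
            respondException(response, e);
        }
    }
}
```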
View File

@ -54,9 +54,7 @@ public class GetPreferenceCommand extends Command {
throws ServletException, IOException { throws ServletException, IOException {
Project project = request.getParameter("project") != null ? getProject(request) : null; Project project = request.getParameter("project") != null ? getProject(request) : null;
PreferenceStore ps = project != null ? PreferenceStore ps = ProjectManager.singleton.getPreferenceStore();
project.getMetadata().getPreferenceStore() :
ProjectManager.singleton.getPreferenceStore();
String prefName = request.getParameter("name"); String prefName = request.getParameter("name");
Object pref = ps.get(prefName); Object pref = ps.get(prefName);

View File

@ -52,9 +52,7 @@ public class SetPreferenceCommand extends Command {
throws ServletException, IOException { throws ServletException, IOException {
Project project = request.getParameter("project") != null ? getProject(request) : null; Project project = request.getParameter("project") != null ? getProject(request) : null;
PreferenceStore ps = project != null ? PreferenceStore ps = ProjectManager.singleton.getPreferenceStore();
project.getMetadata().getPreferenceStore() :
ProjectManager.singleton.getPreferenceStore();
String prefName = request.getParameter("name"); String prefName = request.getParameter("name");
String valueString = request.getParameter("value"); String valueString = request.getParameter("value");

View File

@ -63,7 +63,7 @@ public class GetExpressionHistoryCommand extends Command {
try { try {
Project project = getProject(request); Project project = getProject(request);
List<String> localExpressions = toExpressionList(project.getMetadata().getPreferenceStore().get("scripting.expressions")); List<String> localExpressions = toExpressionList(ProjectManager.singleton.getPreferenceStore().get("scripting.expressions"));
localExpressions = localExpressions.size() > 20 ? localExpressions.subList(0, 20) : localExpressions; localExpressions = localExpressions.size() > 20 ? localExpressions.subList(0, 20) : localExpressions;
List<String> globalExpressions = toExpressionList(ProjectManager.singleton.getPreferenceStore().get("scripting.expressions")); List<String> globalExpressions = toExpressionList(ProjectManager.singleton.getPreferenceStore().get("scripting.expressions"));

View File

@ -54,7 +54,7 @@ public class LogExpressionCommand extends Command {
Project project = getProject(request); Project project = getProject(request);
String expression = request.getParameter("expression"); String expression = request.getParameter("expression");
((TopList) project.getMetadata().getPreferenceStore().get("scripting.expressions")) ((TopList) ProjectManager.singleton.getPreferenceStore().get("scripting.expressions"))
.add(expression); .add(expression);
((TopList) ProjectManager.singleton.getPreferenceStore().get("scripting.expressions")) ((TopList) ProjectManager.singleton.getPreferenceStore().get("scripting.expressions"))

View File

@ -41,8 +41,8 @@ import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse; import javax.servlet.http.HttpServletResponse;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command; import com.google.refine.commands.Command;
import com.google.refine.model.medadata.ProjectMetadata;
public class DeleteProjectCommand extends Command { public class DeleteProjectCommand extends Command {

View File

@ -0,0 +1,48 @@
package com.google.refine.commands.project;
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.everit.json.schema.ValidationException;
import org.json.JSONException;
import com.google.refine.commands.Command;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.MetadataFactory;
import com.google.refine.model.medadata.MetadataFormat;
public class GetMetadataCommand extends Command {
@Override
public void doGet(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
try {
Project project;
MetadataFormat metadataFormat;
try {
project = getProject(request);
metadataFormat = MetadataFormat.valueOf(request.getParameter("metadataFormat"));
} catch (ServletException e) {
respond(response, "error", e.getLocalizedMessage());
return;
}
// for now, only the data package metadata is supported.
if (metadataFormat != MetadataFormat.DATAPACKAGE_METADATA) {
respond(response, "error", "metadata format is not supported");
return;
}
IMetadata metadata = MetadataFactory.buildDataPackageMetadata(project);
respondJSONObject(response, metadata.getJSON());
} catch (JSONException e) {
respondException(response, e);
} catch (ValidationException e) {
respondException(response, e);
}
}
}

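Worth noting when reusing this pattern: `MetadataFormat.valueOf(...)` is a plain enum lookup, so an unrecognized `metadataFormat` parameter surfaces as an unchecked `IllegalArgumentException` rather than the `ServletException` caught above. A small sketch of a null-returning lookup, assuming `MetadataFormat` is a standard Java enum with the `DATAPACKAGE_METADATA` constant used in the command (the helper itself is illustrative):

```java
import com.google.refine.model.medadata.MetadataFormat;

public class MetadataFormatParsing {
    /** Returns null instead of throwing when the parameter is absent or unknown. */
    static MetadataFormat parseFormat(String parameter) {
        if (parameter == null) {
            return null;
        }
        try {
            return MetadataFormat.valueOf(parameter);
        } catch (IllegalArgumentException e) {
            return null; // no enum constant with that name
        }
    }

    public static void main(String[] args) {
        System.out.println(parseFormat("DATAPACKAGE_METADATA")); // DATAPACKAGE_METADATA
        System.out.println(parseFormat("not-a-format"));         // null
    }
}
```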
View File

@ -51,9 +51,9 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command; import com.google.refine.commands.Command;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.ParsingUtilities; import com.google.refine.util.ParsingUtilities;
public class ImportProjectCommand extends Command { public class ImportProjectCommand extends Command {

View File

@ -0,0 +1,83 @@
package com.google.refine.commands.project;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.util.zip.GZIPOutputStream;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.commons.io.IOUtils;
import org.apache.tools.tar.TarOutputStream;
import com.google.refine.ProjectManager;
import com.google.refine.browsing.Engine;
import com.google.refine.commands.Command;
import com.google.refine.exporters.CsvExporter;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.DataPackageMetadata;
import com.google.refine.model.medadata.PackageExtension;
public class PackageProjectCommand extends Command {
@Override
public void doPost(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
ProjectManager.singleton.setBusy(true);
try {
// get the metadata
String metadata = request.getParameter("metadata");
InputStream in = IOUtils.toInputStream(metadata, "UTF-8");
Project project = getProject(request);
Engine engine = getEngine(request, project);
// ensure the project gets saved
DataPackageMetadata dpm = new DataPackageMetadata();
dpm.loadFromStream(in);
ProjectManager.singleton.ensureProjectSaved(project.id);
// export project
CsvExporter exporter = new CsvExporter();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
Writer outputStreamWriter = new OutputStreamWriter(baos);
exporter.export(project, null, engine, outputStreamWriter);
OutputStream os = response.getOutputStream();
try {
PackageExtension.saveZip(dpm.getPackage(), baos, os);
response.setHeader("Content-Type", "application/x-gzip");
} finally {
outputStreamWriter.close();
os.close();
}
} catch (Exception e) {
respondException(response, e);
} finally {
ProjectManager.singleton.setBusy(false);
}
}
protected void gzipTarToOutputStream(Project project, OutputStream os) throws IOException {
GZIPOutputStream gos = new GZIPOutputStream(os);
try {
tarToOutputStream(project, gos);
} finally {
gos.close();
}
}
protected void tarToOutputStream(Project project, OutputStream os) throws IOException {
TarOutputStream tos = new TarOutputStream(os);
try {
ProjectManager.singleton.exportProject(project.id, tos);
} finally {
tos.close();
}
}
}

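The two tar/gzip helpers above close their streams in `finally`; on Java 7+ the same work reads more compactly with try-with-resources. A combined sketch under that assumption, relying only on the calls already present in the hunk (`ProjectManager.singleton.exportProject`, `TarOutputStream`, `GZIPOutputStream`), not a drop-in replacement:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;

import org.apache.tools.tar.TarOutputStream;

import com.google.refine.ProjectManager;
import com.google.refine.model.Project;

public class ProjectStreaming {
    /** Tars the project and gzips it onto the given stream; both wrappers close automatically. */
    static void gzipTarToOutputStream(Project project, OutputStream os) throws IOException {
        try (GZIPOutputStream gos = new GZIPOutputStream(os);
             TarOutputStream tos = new TarOutputStream(gos)) {
            ProjectManager.singleton.exportProject(project.id, tos);
        } // tos closes first, which also finishes the gzip stream beneath it
    }
}
```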
View File

@ -39,8 +39,8 @@ import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest; import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse; import javax.servlet.http.HttpServletResponse;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command; import com.google.refine.commands.Command;
import com.google.refine.model.medadata.ProjectMetadata;
public class RenameProjectCommand extends Command { public class RenameProjectCommand extends Command {
@Override @Override
@ -49,7 +49,7 @@ public class RenameProjectCommand extends Command {
try { try {
String name = request.getParameter("name"); String name = request.getParameter("name");
ProjectMetadata pm = getProjectMetadata(request); ProjectMetadata pm = getMetadata(request);
pm.setName(name); pm.setName(name);

View File

@ -9,15 +9,14 @@ import javax.servlet.http.HttpServletResponse;
import org.json.JSONException; import org.json.JSONException;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command; import com.google.refine.commands.Command;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
public class SetProjectMetadataCommand extends Command { public class SetProjectMetadataCommand extends Command {
@Override @Override
public void doPost(HttpServletRequest request, HttpServletResponse response) public void doPost(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException { throws ServletException, IOException {
Project project = request.getParameter("project") != null ? getProject(request) : null; Project project = request.getParameter("project") != null ? getProject(request) : null;
String metaName = request.getParameter("name"); String metaName = request.getParameter("name");
String valueString = request.getParameter("value"); String valueString = request.getParameter("value");
@ -33,7 +32,7 @@ public class SetProjectMetadataCommand extends Command {
response.setCharacterEncoding("UTF-8"); response.setCharacterEncoding("UTF-8");
response.setHeader("Content-Type", "application/json"); response.setHeader("Content-Type", "application/json");
meta.setAnyField(metaName, valueString); meta.setAnyStringField(metaName, valueString);
ProjectManager.singleton.saveMetadata(meta, project.id); ProjectManager.singleton.saveMetadata(meta, project.id);
respond(response, "{ \"code\" : \"ok\" }"); respond(response, "{ \"code\" : \"ok\" }");

View File

@ -37,9 +37,9 @@ import javax.servlet.http.HttpServletResponse;
import org.json.JSONException; import org.json.JSONException;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command; import com.google.refine.commands.Command;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
public class SetProjectTagsCommand extends Command { public class SetProjectTagsCommand extends Command {
@Override @Override

View File

@ -0,0 +1,42 @@
package com.google.refine.commands.project;
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.json.JSONException;
import org.json.JSONObject;
import com.google.refine.ProjectManager;
import com.google.refine.commands.Command;
import com.google.refine.model.Project;
import com.google.refine.model.medadata.validator.ValidateOperation;
import com.google.refine.util.ParsingUtilities;
public class ValidateSchemaCommand extends Command {
@Override
public void doPost(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
ProjectManager.singleton.setBusy(true);
try {
Project project = getProject(request);
JSONObject optionObj = ParsingUtilities.evaluateJsonStringToObject(
request.getParameter("options"));
new ValidateOperation(project, optionObj).startProcess();
respond(response, "{ \"code\" : \"ok\" }");
} catch (JSONException e) {
respondException(response, e);
} catch (ServletException e) {
respond(response, "error", e.getLocalizedMessage());
return;
} finally {
ProjectManager.singleton.setBusy(false);
}
}
}

View File

@ -180,9 +180,9 @@ public class GetRowsCommand extends Command {
} }
// metadata refresh for row mode and record mode // metadata refresh for row mode and record mode
if (project.getMetadata() != null) { if (project.getMetadata() != null) {
project.getMetadata().setRowCount(project.rows.size()); project.getMetadata().setRowCount(project.rows.size());
} }
} catch (Exception e) { } catch (Exception e) {
respondException(response, e); respondException(response, e);
} }

View File

@ -47,8 +47,8 @@ import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.commands.Command; import com.google.refine.commands.Command;
import com.google.refine.model.medadata.ProjectMetadata;
public class GetAllProjectMetadataCommand extends Command { public class GetAllProjectMetadataCommand extends Command {
@Override @Override

View File

@ -44,7 +44,7 @@ import java.util.Map;
import java.util.Properties; import java.util.Properties;
import java.util.TimeZone; import java.util.TimeZone;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
import org.json.JSONArray; import org.json.JSONArray;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONObject; import org.json.JSONObject;

View File

@ -38,7 +38,7 @@ import java.io.Writer;
import java.util.List; import java.util.List;
import java.util.Properties; import java.util.Properties;
import org.apache.commons.lang.StringEscapeUtils; import org.apache.commons.lang3.StringEscapeUtils;
import org.json.JSONObject; import org.json.JSONObject;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
@ -103,7 +103,7 @@ public class HtmlTableExporter implements WriterExporter {
if (cellData.link != null) { if (cellData.link != null) {
writer.write("<a href=\""); writer.write("<a href=\"");
// TODO: The escape below looks wrong, but is probably harmless in most cases // TODO: The escape below looks wrong, but is probably harmless in most cases
writer.write(StringEscapeUtils.escapeHtml(cellData.link)); writer.write(StringEscapeUtils.escapeHtml4(cellData.link));
writer.write("\">"); writer.write("\">");
} }
writer.write(StringEscapeUtils.escapeXml(cellData.text)); writer.write(StringEscapeUtils.escapeXml(cellData.text));

View File

@ -42,7 +42,7 @@ import java.util.GregorianCalendar;
import java.util.Locale; import java.util.Locale;
import java.util.Properties; import java.util.Properties;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;

View File

@ -35,7 +35,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties; import java.util.Properties;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;

View File

@ -37,7 +37,7 @@ import java.util.Calendar;
import java.util.Date; import java.util.Date;
import java.util.Properties; import java.util.Properties;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;

View File

@ -37,7 +37,7 @@ import java.io.UnsupportedEncodingException;
import java.net.URLEncoder; import java.net.URLEncoder;
import java.util.Properties; import java.util.Properties;
import org.apache.commons.lang.StringEscapeUtils; import org.apache.commons.lang3.StringEscapeUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;
@ -65,13 +65,13 @@ public class Escape implements Function {
if (o2 instanceof String) { if (o2 instanceof String) {
String mode = ((String) o2).toLowerCase(); String mode = ((String) o2).toLowerCase();
if ("html".equals(mode)) { if ("html".equals(mode)) {
return StringEscapeUtils.escapeHtml(s); return StringEscapeUtils.escapeHtml4(s);
} else if ("xml".equals(mode)) { } else if ("xml".equals(mode)) {
return StringEscapeUtils.escapeXml(s); return StringEscapeUtils.escapeXml11(s);
} else if ("csv".equals(mode)) { } else if ("csv".equals(mode)) {
return StringEscapeUtils.escapeCsv(s); return StringEscapeUtils.escapeCsv(s);
} else if ("javascript".equals(mode)) { } else if ("javascript".equals(mode)) {
return StringEscapeUtils.escapeJavaScript(s); return StringEscapeUtils.escapeEcmaScript(s);
} else if ("url".equals(mode)) { } else if ("url".equals(mode)) {
try { try {
return URLEncoder.encode(s,"UTF-8"); return URLEncoder.encode(s,"UTF-8");

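The `escape()` GREL function now maps each mode onto its commons-lang3 counterpart (`escapeHtml` → `escapeHtml4`, `escapeXml` → `escapeXml11`, `escapeJavaScript` → `escapeEcmaScript`), while `escapeCsv` keeps its name. A quick illustration of the lang3 calls used above; the printed results in the comments are approximate expected output:

```java
import org.apache.commons.lang3.StringEscapeUtils;

public class EscapeExamples {
    public static void main(String[] args) {
        String s = "Fish & \"Chips\" <takeaway>";

        System.out.println(StringEscapeUtils.escapeHtml4(s));
        // Fish &amp; &quot;Chips&quot; &lt;takeaway&gt;

        System.out.println(StringEscapeUtils.escapeXml11(s));
        // Fish &amp; &quot;Chips&quot; &lt;takeaway&gt;

        System.out.println(StringEscapeUtils.escapeEcmaScript("alert('hi')"));
        // alert(\'hi\')

        System.out.println(StringEscapeUtils.escapeCsv(s));
        // "Fish & ""Chips"" <takeaway>"
    }
}
```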
View File

@ -35,7 +35,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties; import java.util.Properties;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;

View File

@ -40,11 +40,11 @@ import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.expr.EvalError; import com.google.refine.expr.EvalError;
import com.google.refine.grel.ControlFunctionRegistry; import com.google.refine.grel.ControlFunctionRegistry;
import com.google.refine.grel.Function; import com.google.refine.grel.Function;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
public class Reinterpret implements Function { public class Reinterpret implements Function {

View File

@ -35,7 +35,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties; import java.util.Properties;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;

View File

@ -36,7 +36,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties; import java.util.Properties;
import java.util.regex.Pattern; import java.util.regex.Pattern;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;

View File

@ -35,7 +35,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties; import java.util.Properties;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;

View File

@ -35,7 +35,7 @@ package com.google.refine.expr.functions.strings;
import java.util.Properties; import java.util.Properties;
import org.apache.commons.lang.WordUtils; import org.apache.commons.lang3.text.WordUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;

View File

@ -37,7 +37,7 @@ import java.io.UnsupportedEncodingException;
import java.net.URLDecoder; import java.net.URLDecoder;
import java.util.Properties; import java.util.Properties;
import org.apache.commons.lang.StringEscapeUtils; import org.apache.commons.lang3.StringEscapeUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;
@ -56,13 +56,13 @@ public class Unescape implements Function {
String s = (String) o1; String s = (String) o1;
String mode = ((String) o2).toLowerCase(); String mode = ((String) o2).toLowerCase();
if ("html".equals(mode)) { if ("html".equals(mode)) {
return StringEscapeUtils.unescapeHtml(s); return StringEscapeUtils.unescapeHtml4(s);
} else if ("xml".equals(mode)) { } else if ("xml".equals(mode)) {
return StringEscapeUtils.unescapeXml(s); return StringEscapeUtils.unescapeXml(s);
} else if ("csv".equals(mode)) { } else if ("csv".equals(mode)) {
return StringEscapeUtils.unescapeCsv(s); return StringEscapeUtils.unescapeCsv(s);
} else if ("javascript".equals(mode)) { } else if ("javascript".equals(mode)) {
return StringEscapeUtils.unescapeJavaScript(s); return StringEscapeUtils.unescapeEcmaScript(s);
} else if ("url".equals(mode)) { } else if ("url".equals(mode)) {
try { try {
return URLDecoder.decode(s,"UTF-8"); return URLDecoder.decode(s,"UTF-8");

View File

@ -33,7 +33,7 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
package com.google.refine.grel.controls; package com.google.refine.grel.controls;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
public class IsNumeric extends IsTest { public class IsNumeric extends IsTest {
@Override @Override

View File

@ -44,7 +44,7 @@ import java.util.HashMap;
import java.util.List; import java.util.List;
import java.util.Map; import java.util.Map;
import org.apache.commons.lang.exception.ExceptionUtils; import org.apache.commons.lang3.exception.ExceptionUtils;
import org.apache.poi.POIXMLDocument; import org.apache.poi.POIXMLDocument;
import org.apache.poi.POIXMLException; import org.apache.poi.POIXMLException;
import org.apache.poi.common.usermodel.Hyperlink; import org.apache.poi.common.usermodel.Hyperlink;
@ -60,13 +60,13 @@ import org.json.JSONObject;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities; import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Cell; import com.google.refine.model.Cell;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.Recon; import com.google.refine.model.Recon;
import com.google.refine.model.Recon.Judgment; import com.google.refine.model.Recon.Judgment;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.model.ReconCandidate; import com.google.refine.model.ReconCandidate;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
@ -191,7 +191,7 @@ public class ExcelImporter extends TabularImportingParserBase {
// value is fileName#sheetIndex // value is fileName#sheetIndex
fileNameAndSheetIndex = sheetObj.getString("fileNameAndSheetIndex").split("#"); fileNameAndSheetIndex = sheetObj.getString("fileNameAndSheetIndex").split("#");
} catch (JSONException e) { } catch (JSONException e) {
logger.error(ExceptionUtils.getFullStackTrace(e)); logger.error(ExceptionUtils.getStackTrace(e));
} }
if (!fileNameAndSheetIndex[0].equals(fileSource)) if (!fileNameAndSheetIndex[0].equals(fileSource))

View File

@ -14,10 +14,10 @@ import java.util.List;
import org.json.JSONArray; import org.json.JSONArray;
import org.json.JSONObject; import org.json.JSONObject;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities; import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
public class FixedWidthImporter extends TabularImportingParserBase { public class FixedWidthImporter extends TabularImportingParserBase {

View File

@ -44,7 +44,6 @@ import org.json.JSONObject;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importers.ImporterUtilities.MultiFileReadingProgress; import com.google.refine.importers.ImporterUtilities.MultiFileReadingProgress;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingParser; import com.google.refine.importing.ImportingParser;
@ -52,6 +51,7 @@ import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Column; import com.google.refine.model.Column;
import com.google.refine.model.ModelException; import com.google.refine.model.ModelException;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
abstract public class ImportingParserBase implements ImportingParser { abstract public class ImportingParserBase implements ImportingParser {

View File

@ -49,7 +49,6 @@ import org.json.JSONObject;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importers.tree.ImportColumnGroup; import com.google.refine.importers.tree.ImportColumnGroup;
import com.google.refine.importers.tree.TreeImportingParserBase; import com.google.refine.importers.tree.TreeImportingParserBase;
import com.google.refine.importers.tree.TreeReader; import com.google.refine.importers.tree.TreeReader;
@ -57,6 +56,7 @@ import com.google.refine.importers.tree.TreeReaderException;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities; import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
public class JsonImporter extends TreeImportingParserBase { public class JsonImporter extends TreeImportingParserBase {

View File

@ -10,9 +10,9 @@ import org.json.JSONObject;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
public class LineBasedImporter extends TabularImportingParserBase { public class LineBasedImporter extends TabularImportingParserBase {

View File

@ -44,7 +44,7 @@ import java.util.HashMap;
import java.util.List; import java.util.List;
import java.util.Map; import java.util.Map;
import org.apache.commons.lang.exception.ExceptionUtils; import org.apache.commons.lang3.exception.ExceptionUtils;
import org.json.JSONArray; import org.json.JSONArray;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONObject; import org.json.JSONObject;
@ -55,13 +55,13 @@ import org.odftoolkit.odfdom.doc.table.OdfTableRow;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities; import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Cell; import com.google.refine.model.Cell;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.Recon; import com.google.refine.model.Recon;
import com.google.refine.model.Recon.Judgment; import com.google.refine.model.Recon.Judgment;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.model.ReconCandidate; import com.google.refine.model.ReconCandidate;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
@ -150,7 +150,7 @@ public class OdsImporter extends TabularImportingParserBase {
// value is fileName#sheetIndex // value is fileName#sheetIndex
fileNameAndSheetIndex = sheetObj.getString("fileNameAndSheetIndex").split("#"); fileNameAndSheetIndex = sheetObj.getString("fileNameAndSheetIndex").split("#");
} catch (JSONException e) { } catch (JSONException e) {
logger.error(ExceptionUtils.getFullStackTrace(e)); logger.error(ExceptionUtils.getStackTrace(e));
} }
if (!fileNameAndSheetIndex[0].equals(fileSource)) if (!fileNameAndSheetIndex[0].equals(fileSource))

View File

@ -50,7 +50,6 @@ import org.jrdf.parser.RdfReader;
import org.jrdf.util.ClosableIterable; import org.jrdf.util.ClosableIterable;
import org.json.JSONObject; import org.json.JSONObject;
import com.google.refine.ProjectMetadata;
import com.google.refine.expr.ExpressionUtils; import com.google.refine.expr.ExpressionUtils;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.model.Cell; import com.google.refine.model.Cell;
@ -58,6 +57,7 @@ import com.google.refine.model.Column;
import com.google.refine.model.ModelException; import com.google.refine.model.ModelException;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.Row; import com.google.refine.model.Row;
import com.google.refine.model.medadata.ProjectMetadata;
public class RdfTripleImporter extends ImportingParserBase { public class RdfTripleImporter extends ImportingParserBase {
private RdfReader rdfReader; private RdfReader rdfReader;

View File

@ -49,15 +49,15 @@ import java.util.HashMap;
import java.util.List; import java.util.List;
import java.util.Map; import java.util.Map;
import org.apache.commons.lang.StringEscapeUtils; import org.apache.commons.lang3.StringEscapeUtils;
import org.json.JSONObject; import org.json.JSONObject;
import au.com.bytecode.opencsv.CSVParser; import au.com.bytecode.opencsv.CSVParser;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities; import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
public class SeparatorBasedImporter extends TabularImportingParserBase { public class SeparatorBasedImporter extends TabularImportingParserBase {

View File

@ -41,13 +41,13 @@ import java.util.List;
import org.json.JSONObject; import org.json.JSONObject;
import com.google.refine.ProjectMetadata;
import com.google.refine.expr.ExpressionUtils; import com.google.refine.expr.ExpressionUtils;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.model.Cell; import com.google.refine.model.Cell;
import com.google.refine.model.Column; import com.google.refine.model.Column;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.Row; import com.google.refine.model.Row;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
abstract public class TabularImportingParserBase extends ImportingParserBase { abstract public class TabularImportingParserBase extends ImportingParserBase {

View File

@ -56,13 +56,13 @@ import org.sweble.wikitext.parser.preprocessor.PreprocessedWikitext;
import xtc.parser.ParseException; import xtc.parser.ParseException;
import com.google.refine.ProjectMetadata;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.model.Cell; import com.google.refine.model.Cell;
import com.google.refine.model.Column; import com.google.refine.model.Column;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.Recon; import com.google.refine.model.Recon;
import com.google.refine.model.ReconStats; import com.google.refine.model.ReconStats;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.model.recon.StandardReconConfig.ColumnDetail; import com.google.refine.model.recon.StandardReconConfig.ColumnDetail;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
import com.google.refine.model.recon.StandardReconConfig; import com.google.refine.model.recon.StandardReconConfig;

View File

@ -51,7 +51,6 @@ import org.json.JSONObject;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.importers.tree.ImportColumnGroup; import com.google.refine.importers.tree.ImportColumnGroup;
import com.google.refine.importers.tree.TreeImportingParserBase; import com.google.refine.importers.tree.TreeImportingParserBase;
import com.google.refine.importers.tree.TreeReader; import com.google.refine.importers.tree.TreeReader;
@ -59,6 +58,7 @@ import com.google.refine.importers.tree.TreeReaderException;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities; import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
public class XmlImporter extends TreeImportingParserBase { public class XmlImporter extends TreeImportingParserBase {

View File

@ -3,7 +3,7 @@ package com.google.refine.importers.tree;
import java.util.LinkedHashMap; import java.util.LinkedHashMap;
import java.util.Map; import java.util.Map;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
/** /**
* A column group describes a branch in tree structured data * A column group describes a branch in tree structured data

View File

@ -39,16 +39,16 @@ import java.io.InputStream;
import java.io.Reader; import java.io.Reader;
import java.util.List; import java.util.List;
import org.apache.commons.lang.NotImplementedException; import org.apache.commons.lang3.NotImplementedException;
import org.json.JSONObject; import org.json.JSONObject;
import com.google.refine.ProjectMetadata;
import com.google.refine.importers.ImporterUtilities; import com.google.refine.importers.ImporterUtilities;
import com.google.refine.importers.ImporterUtilities.MultiFileReadingProgress; import com.google.refine.importers.ImporterUtilities.MultiFileReadingProgress;
import com.google.refine.importers.ImportingParserBase; import com.google.refine.importers.ImportingParserBase;
import com.google.refine.importing.ImportingJob; import com.google.refine.importing.ImportingJob;
import com.google.refine.importing.ImportingUtilities; import com.google.refine.importing.ImportingUtilities;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
/** /**
@ -154,7 +154,7 @@ abstract public class TreeImportingParserBase extends ImportingParserBase {
JSONObject options, JSONObject options,
List<Exception> exceptions List<Exception> exceptions
) { ) {
throw new NotImplementedException(); throw new NotImplementedException("project ID:" + project.id);
} }
/** /**

View File

@ -271,7 +271,15 @@ public class DefaultImportingController implements ImportingController {
throw new ServletException(e); throw new ServletException(e);
} }
} }
/**
* Return the job data to the front end.
* @param request
* @param response
* @param job
* @throws ServletException
* @throws IOException
*/
private void replyWithJobData(HttpServletRequest request, HttpServletResponse response, ImportingJob job) private void replyWithJobData(HttpServletRequest request, HttpServletResponse response, ImportingJob job)
throws ServletException, IOException { throws ServletException, IOException {

View File

@ -47,8 +47,8 @@ import org.json.JSONWriter;
import com.google.refine.Jsonizable; import com.google.refine.Jsonizable;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
@ -139,6 +139,14 @@ public class ImportingJob implements Jsonizable {
} }
} }
/**
* Check whether the file record is a metadata file entry.
* @param fileRecordObject
* @return true if the record carries a "metaDataFormat" key
*/
public boolean isMetadataFileRecord(JSONObject fileRecordObject) {
return fileRecordObject.has("metaDataFormat");
}
public List<JSONObject> getSelectedFileRecords() { public List<JSONObject> getSelectedFileRecords() {
List<JSONObject> results = new ArrayList<JSONObject>(); List<JSONObject> results = new ArrayList<JSONObject>();
@ -208,5 +216,4 @@ public class ImportingJob implements Jsonizable {
writer.endObject(); writer.endObject();
} }
} }
} }

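`isMetadataFileRecord` simply tests a file record for the `metaDataFormat` key. A minimal sketch of the kind of record it inspects; the check is copied from the hunk above for illustration, and the other field names and values are illustrative only:

```java
import org.json.JSONObject;

public class FileRecordCheck {
    // Same test as ImportingJob.isMetadataFileRecord, extracted for illustration.
    static boolean isMetadataFileRecord(JSONObject fileRecordObject) {
        return fileRecordObject.has("metaDataFormat");
    }

    public static void main(String[] args) {
        JSONObject dataFile = new JSONObject()
                .put("fileName", "rows.csv")
                .put("format", "text/line-based/*sv");
        JSONObject metadataFile = new JSONObject()
                .put("fileName", "datapackage.json")
                .put("metaDataFormat", "DATAPACKAGE_METADATA");

        System.out.println(isMetadataFileRecord(dataFile));     // false
        System.out.println(isMetadataFileRecord(metadataFile)); // true
    }
}
```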
View File

@ -37,8 +37,8 @@ import java.util.List;
import org.json.JSONObject; import org.json.JSONObject;
import com.google.refine.ProjectMetadata;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.ProjectMetadata;
public interface ImportingParser { public interface ImportingParser {
/** /**

View File

@ -42,6 +42,7 @@ import java.io.InputStream;
import java.io.InputStreamReader; import java.io.InputStreamReader;
import java.io.Reader; import java.io.Reader;
import java.io.UnsupportedEncodingException; import java.io.UnsupportedEncodingException;
import java.net.URISyntaxException;
import java.net.URL; import java.net.URL;
import java.net.URLConnection; import java.net.URLConnection;
import java.text.NumberFormat; import java.text.NumberFormat;
@ -49,9 +50,11 @@ import java.util.ArrayList;
import java.util.Collections; import java.util.Collections;
import java.util.Comparator; import java.util.Comparator;
import java.util.HashMap; import java.util.HashMap;
import java.util.Iterator;
import java.util.List; import java.util.List;
import java.util.Map; import java.util.Map;
import java.util.Properties; import java.util.Properties;
import java.util.stream.Collectors;
import java.util.zip.GZIPInputStream; import java.util.zip.GZIPInputStream;
import java.util.zip.ZipEntry; import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream; import java.util.zip.ZipInputStream;
@ -65,10 +68,14 @@ import org.apache.commons.fileupload.ProgressListener;
import org.apache.commons.fileupload.disk.DiskFileItemFactory; import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload; import org.apache.commons.fileupload.servlet.ServletFileUpload;
import org.apache.commons.fileupload.util.Streams; import org.apache.commons.fileupload.util.Streams;
import org.apache.commons.lang.StringUtils;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.apache.http.HttpEntity; import org.apache.http.HttpEntity;
import org.apache.http.HttpResponse; import org.apache.http.HttpResponse;
import org.apache.http.HttpStatus;
import org.apache.http.auth.AuthScope; import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials; import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.methods.HttpGet; import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DecompressingHttpClient; import org.apache.http.impl.client.DecompressingHttpClient;
import org.apache.http.impl.client.DefaultHttpClient; import org.apache.http.impl.client.DefaultHttpClient;
@ -82,16 +89,35 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.RefineServlet; import com.google.refine.RefineServlet;
import com.google.refine.importing.ImportingManager.Format; import com.google.refine.importing.ImportingManager.Format;
import com.google.refine.importing.UrlRewriter.Result; import com.google.refine.importing.UrlRewriter.Result;
import com.google.refine.model.Column;
import com.google.refine.model.ColumnModel;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.Row;
import com.google.refine.model.medadata.DataPackageMetadata;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.MetadataFactory;
import com.google.refine.model.medadata.MetadataFormat;
import com.google.refine.model.medadata.PackageExtension;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.preference.PreferenceStore;
import com.google.refine.util.JSONUtilities; import com.google.refine.util.JSONUtilities;
import io.frictionlessdata.datapackage.Package;
import io.frictionlessdata.tableschema.Field;
import io.frictionlessdata.tableschema.Schema;
import io.frictionlessdata.tableschema.TypeInferrer;
import io.frictionlessdata.tableschema.exceptions.TypeInferringException;
public class ImportingUtilities { public class ImportingUtilities {
final static protected Logger logger = LoggerFactory.getLogger("importing-utilities"); final static protected Logger logger = LoggerFactory.getLogger("importing-utilities");
private final static String METADATA_FILE_KEY = "metadataFile";
private static final int INFER_ROW_LIMIT = 100;
static public interface Progress { static public interface Progress {
public void setProgress(String message, int percent); public void setProgress(String message, int percent);
public boolean isCanceled(); public boolean isCanceled();
@ -172,11 +198,11 @@ public class ImportingUtilities {
) throws Exception { ) throws Exception {
JSONArray fileRecords = new JSONArray(); JSONArray fileRecords = new JSONArray();
JSONUtilities.safePut(retrievalRecord, "files", fileRecords); JSONUtilities.safePut(retrievalRecord, "files", fileRecords);
JSONUtilities.safePut(retrievalRecord, "downloadCount", 0);
JSONUtilities.safePut(retrievalRecord, "archiveCount", 0);
int clipboardCount = 0; int clipboardCount = 0;
int uploadCount = 0; int uploadCount = 0;
int downloadCount = 0;
int archiveCount = 0;
// This tracks the total progress, which involves uploading data from the client // This tracks the total progress, which involves uploading data from the client
// as well as downloading data from URLs. // as well as downloading data from URLs.
@ -220,7 +246,7 @@ public class ImportingUtilities {
List<FileItem> tempFiles = (List<FileItem>)upload.parseRequest(request); List<FileItem> tempFiles = (List<FileItem>)upload.parseRequest(request);
progress.setProgress("Uploading data ...", -1); progress.setProgress("Uploading data ...", -1);
parts: for (FileItem fileItem : tempFiles) { for (FileItem fileItem : tempFiles) {
if (progress.isCanceled()) { if (progress.isCanceled()) {
break; break;
} }
@ -255,107 +281,27 @@ public class ImportingUtilities {
} else if (name.equals("download")) { } else if (name.equals("download")) {
String urlString = Streams.asString(stream); String urlString = Streams.asString(stream);
URL url = new URL(urlString); download(rawDataDir, retrievalRecord, progress, fileRecords, update, urlString);
processDataPackage(retrievalRecord, fileRecords);
JSONObject fileRecord = new JSONObject(); } else if (name.equals("data-package")) {
JSONUtilities.safePut(fileRecord, "origin", "download"); String urlString = Streams.asString(stream);
JSONUtilities.safePut(fileRecord, "url", urlString); List<Result> results = null;
for (UrlRewriter rewriter : ImportingManager.urlRewriters) { for (UrlRewriter rewriter : ImportingManager.urlRewriters) {
Result result = rewriter.rewrite(urlString); results = rewriter.rewrite(urlString);
if (result != null) { if (results != null) {
urlString = result.rewrittenUrl; for (Result result : results) {
url = new URL(urlString); download(rawDataDir, retrievalRecord, progress, fileRecords,
update, result.rewrittenUrl, result.metaDataFormat);
JSONUtilities.safePut(fileRecord, "url", urlString);
JSONUtilities.safePut(fileRecord, "format", result.format);
if (!result.download) {
downloadCount++;
JSONUtilities.append(fileRecords, fileRecord);
continue parts;
} }
} }
} }
if ("http".equals(url.getProtocol()) || "https".equals(url.getProtocol())) {
DefaultHttpClient client = new DefaultHttpClient();
DecompressingHttpClient httpclient =
new DecompressingHttpClient(client);
HttpGet httpGet = new HttpGet(url.toURI());
httpGet.setHeader("User-Agent", RefineServlet.getUserAgent());
if ("https".equals(url.getProtocol())) {
// HTTPS only - no sending password in the clear over HTTP
String userinfo = url.getUserInfo();
if (userinfo != null) {
int s = userinfo.indexOf(':');
if (s > 0) {
String user = userinfo.substring(0, s);
String pw = userinfo.substring(s + 1, userinfo.length());
client.getCredentialsProvider().setCredentials(
new AuthScope(url.getHost(), 443),
new UsernamePasswordCredentials(user, pw));
}
}
}
HttpResponse response = httpclient.execute(httpGet);
try {
response.getStatusLine();
HttpEntity entity = response.getEntity();
if (entity == null) {
throw new Exception("No content found in " + url.toString());
}
InputStream stream2 = entity.getContent();
String encoding = null;
if (entity.getContentEncoding() != null) {
encoding = entity.getContentEncoding().getValue();
}
JSONUtilities.safePut(fileRecord, "declaredEncoding", encoding);
String contentType = null;
if (entity.getContentType() != null) {
contentType = entity.getContentType().getValue();
}
JSONUtilities.safePut(fileRecord, "declaredMimeType", contentType);
if (saveStream(stream2, url, rawDataDir, progress, update,
fileRecord, fileRecords,
entity.getContentLength())) {
archiveCount++;
}
downloadCount++;
EntityUtils.consume(entity);
} finally {
httpGet.releaseConnection();
}
} else {
// Fallback handling for non HTTP connections (only FTP?)
URLConnection urlConnection = url.openConnection();
urlConnection.setConnectTimeout(5000);
urlConnection.connect();
InputStream stream2 = urlConnection.getInputStream();
JSONUtilities.safePut(fileRecord, "declaredEncoding",
urlConnection.getContentEncoding());
JSONUtilities.safePut(fileRecord, "declaredMimeType",
urlConnection.getContentType());
try {
if (saveStream(stream2, url, rawDataDir, progress,
update, fileRecord, fileRecords,
urlConnection.getContentLength())) {
archiveCount++;
}
downloadCount++;
} finally {
stream2.close();
}
}
} else { } else {
String value = Streams.asString(stream); String value = Streams.asString(stream);
parameters.put(name, value); parameters.put(name, value);
// TODO: We really want to store this on the request so it's available for everyone // TODO: We really want to store this on the request so it's available for everyone
// request.getParameterMap().put(name, value); // request.getParameterMap().put(name, value);
} }
} else { // is file content } else { // is file content
String fileName = fileItem.getName(); String fileName = fileItem.getName();
if (fileName.length() > 0) { if (fileName.length() > 0) {
@ -376,9 +322,11 @@ public class ImportingUtilities {
JSONUtilities.safePut(fileRecord, "size", saveStreamToFile(stream, file, null)); JSONUtilities.safePut(fileRecord, "size", saveStreamToFile(stream, file, null));
if (postProcessRetrievedFile(rawDataDir, file, fileRecord, fileRecords, progress)) { if (postProcessRetrievedFile(rawDataDir, file, fileRecord, fileRecords, progress)) {
archiveCount++; JSONUtilities.safeInc(retrievalRecord, "archiveCount");
} }
processDataPackage(retrievalRecord, fileRecords);
uploadCount++; uploadCount++;
} }
} }
@ -392,9 +340,144 @@ public class ImportingUtilities {
} }
JSONUtilities.safePut(retrievalRecord, "uploadCount", uploadCount); JSONUtilities.safePut(retrievalRecord, "uploadCount", uploadCount);
JSONUtilities.safePut(retrievalRecord, "downloadCount", downloadCount);
JSONUtilities.safePut(retrievalRecord, "clipboardCount", clipboardCount); JSONUtilities.safePut(retrievalRecord, "clipboardCount", clipboardCount);
JSONUtilities.safePut(retrievalRecord, "archiveCount", archiveCount); }
private static void processDataPackage(JSONObject retrievalRecord, JSONArray fileRecords) {
int dataPackageJSONFileIndex = getDataPackageJSONFile(fileRecords);
if (dataPackageJSONFileIndex >= 0) {
JSONObject dataPackageJSONFile = (JSONObject) fileRecords.get(dataPackageJSONFileIndex);
JSONUtilities.safePut(dataPackageJSONFile, "metaDataFormat", MetadataFormat.DATAPACKAGE_METADATA.name());
JSONUtilities.safePut(retrievalRecord, METADATA_FILE_KEY, dataPackageJSONFile);
fileRecords.remove(dataPackageJSONFileIndex);
}
}
private static int getDataPackageJSONFile(JSONArray fileRecords) {
for (int i = 0; i < fileRecords.length(); i++) {
JSONObject file = fileRecords.getJSONObject(i);
if (file.has("archiveFileName") &&
file.has("fileName") &&
file.get("fileName").equals(DataPackageMetadata.DEFAULT_FILE_NAME)) {
return i;
}
}
return -1;
}
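A sketch of what these two helpers do to the retrieval record, assuming DataPackageMetadata.DEFAULT_FILE_NAME resolves to the standard "datapackage.json" descriptor name. The logic is reproduced inline because the helpers are private; the "metadataFile" key mirrors METADATA_FILE_KEY above.

import org.json.JSONArray;
import org.json.JSONObject;

public class ProcessDataPackageSketch {
    public static void main(String[] args) {
        JSONObject retrievalRecord = new JSONObject();
        JSONArray fileRecords = new JSONArray();

        JSONObject csv = new JSONObject().put("fileName", "data.csv").put("archiveFileName", "pkg.zip");
        JSONObject descriptor = new JSONObject().put("fileName", "datapackage.json").put("archiveFileName", "pkg.zip");
        fileRecords.put(csv).put(descriptor);

        // Same idea as getDataPackageJSONFile(): find the descriptor among the extracted files...
        int index = -1;
        for (int i = 0; i < fileRecords.length(); i++) {
            JSONObject record = fileRecords.getJSONObject(i);
            if (record.has("archiveFileName") && "datapackage.json".equals(record.optString("fileName"))) {
                index = i;
                break;
            }
        }

        // ...and, as in processDataPackage(), tag it and promote it to the retrieval record.
        if (index >= 0) {
            JSONObject found = fileRecords.getJSONObject(index);
            found.put("metaDataFormat", "DATAPACKAGE_METADATA");
            retrievalRecord.put("metadataFile", found);
            fileRecords.remove(index);
        }

        System.out.println(retrievalRecord); // {"metadataFile":{"fileName":"datapackage.json",...}}
        System.out.println(fileRecords);     // only data.csv remains
    }
}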
private static void download(File rawDataDir, JSONObject retrievalRecord, final Progress progress,
JSONArray fileRecords, final SavingUpdate update, String urlString)
throws URISyntaxException, IOException, ClientProtocolException, Exception {
download(rawDataDir, retrievalRecord, progress, fileRecords, update, urlString, null);
}
/**
* Download the content of the given URL into the raw data directory and record it as a file record.
* @param rawDataDir
* @param retrievalRecord
* @param progress
* @param fileRecords
* @param update
* @param urlString
* @param metaDataFormat optional metadata format; when set, the downloaded file is registered as the job's metadata file
* @throws URISyntaxException
* @throws IOException
* @throws ClientProtocolException
* @throws Exception
*/
private static void download(File rawDataDir, JSONObject retrievalRecord, final Progress progress,
JSONArray fileRecords, final SavingUpdate update, String urlString, String metaDataFormat)
throws URISyntaxException, IOException, ClientProtocolException, Exception {
URL url = new URL(urlString);
JSONObject fileRecord = new JSONObject();
JSONUtilities.safePut(fileRecord, "origin", "download");
JSONUtilities.safePut(fileRecord, "url", urlString);
if ("http".equals(url.getProtocol()) || "https".equals(url.getProtocol())) {
DefaultHttpClient client = new DefaultHttpClient();
DecompressingHttpClient httpclient =
new DecompressingHttpClient(client);
HttpGet httpGet = new HttpGet(url.toURI());
httpGet.setHeader("User-Agent", RefineServlet.getUserAgent());
if ("https".equals(url.getProtocol())) {
// HTTPS only - no sending password in the clear over HTTP
String userinfo = url.getUserInfo();
if (userinfo != null) {
int s = userinfo.indexOf(':');
if (s > 0) {
String user = userinfo.substring(0, s);
String pw = userinfo.substring(s + 1, userinfo.length());
client.getCredentialsProvider().setCredentials(
new AuthScope(url.getHost(), 443),
new UsernamePasswordCredentials(user, pw));
}
}
}
HttpResponse response = httpclient.execute(httpGet);
try {
int code = response.getStatusLine().getStatusCode();
if (code != HttpStatus.SC_OK) {
throw new Exception("HTTP response code: " + code +
" when accessing URL: "+ url.toString());
}
HttpEntity entity = response.getEntity();
if (entity == null) {
throw new Exception("No content found in " + url.toString());
}
InputStream stream2 = entity.getContent();
String encoding = null;
if (entity.getContentEncoding() != null) {
encoding = entity.getContentEncoding().getValue();
}
JSONUtilities.safePut(fileRecord, "declaredEncoding", encoding);
String contentType = null;
if (entity.getContentType() != null) {
contentType = entity.getContentType().getValue();
}
JSONUtilities.safePut(fileRecord, "declaredMimeType", contentType);
if (saveStream(stream2, url, rawDataDir, progress, update,
fileRecord, fileRecords,
entity.getContentLength())) {
JSONUtilities.safeInc(retrievalRecord, "archiveCount");
}
if (metaDataFormat != null) {
JSONUtilities.safePut(fileRecord, "metaDataFormat", metaDataFormat);
JSONUtilities.safePut(retrievalRecord, METADATA_FILE_KEY, fileRecord);
fileRecords.remove(0);
}
JSONUtilities.safeInc(retrievalRecord, "downloadCount");
EntityUtils.consume(entity);
} finally {
httpGet.releaseConnection();
}
} else {
// Fallback handling for non HTTP connections (only FTP?)
URLConnection urlConnection = url.openConnection();
urlConnection.setConnectTimeout(5000);
urlConnection.connect();
InputStream stream2 = urlConnection.getInputStream();
JSONUtilities.safePut(fileRecord, "declaredEncoding",
urlConnection.getContentEncoding());
JSONUtilities.safePut(fileRecord, "declaredMimeType",
urlConnection.getContentType());
try {
if (saveStream(stream2, url, rawDataDir, progress,
update, fileRecord, fileRecords,
urlConnection.getContentLength())) {
JSONUtilities.safeInc(retrievalRecord, "archiveCount");
}
if (metaDataFormat != null)
JSONUtilities.safePut(fileRecord, "metaDataFormat", metaDataFormat);
JSONUtilities.safeInc(retrievalRecord, "downloadCount");
} finally {
stream2.close();
}
}
} }
private static boolean saveStream(InputStream stream, URL url, File rawDataDir, final Progress progress, private static boolean saveStream(InputStream stream, URL url, File rawDataDir, final Progress progress,
@ -1021,8 +1104,45 @@ public class ImportingUtilities {
if (exceptions.size() == 0) { if (exceptions.size() == 0) {
project.update(); // update all internal models, indexes, caches, etc. project.update(); // update all internal models, indexes, caches, etc.
boolean hasMetadataFileRecord = ((JSONObject)job.getRetrievalRecord()).has(METADATA_FILE_KEY);
if (hasMetadataFileRecord) {
JSONObject metadataFileRecord = (JSONObject) job.getRetrievalRecord().get(METADATA_FILE_KEY);
String metadataFormat = (String)metadataFileRecord.get("metaDataFormat");
IMetadata metadata = MetadataFactory.buildMetadata(MetadataFormat.valueOf(metadataFormat));
String relativePath = metadataFileRecord.getString("location");
File metadataFile = new File(job.getRawDataDir(), relativePath);
metadata.loadFromFile(metadataFile);
// process the data package metadata
if (MetadataFormat.valueOf(metadataFormat) == MetadataFormat.DATAPACKAGE_METADATA) {
populateDataPackageMetadata(project, pm, (DataPackageMetadata) metadata);
}
logger.info(metadataFileRecord.get("metaDataFormat") + " metadata is set for project " + project.id);
}
ProjectManager.singleton.registerProject(project, pm); ProjectManager.singleton.registerProject(project, pm);
// infer the column type
if (project.columnModel.columns.get(0).getType().isEmpty()) {
List<Object[]> listCells = new ArrayList<Object[]>(INFER_ROW_LIMIT);
List<Row> rows = project.rows
.stream()
.limit(INFER_ROW_LIMIT)
.collect(Collectors.toList());
rows.forEach(r->listCells.add(r.cells.toArray()));
try {
JSONObject fieldsJSON = TypeInferrer.getInstance().infer(listCells,
project.columnModel.getColumnNames().toArray(new String[0]),
100);
populateColumnTypes(project.columnModel, fieldsJSON.getJSONArray(Schema.JSON_KEY_FIELDS));
} catch (TypeInferringException e) {
logger.error("infer column type exception.", ExceptionUtils.getStackTrace(e));
}
}
job.setProjectID(project.id); job.setProjectID(project.id);
job.setState("created-project"); job.setState("created-project");
} else { } else {
@ -1032,11 +1152,72 @@ public class ImportingUtilities {
job.updating = false; job.updating = false;
} }
} }
private static void populateDataPackageMetadata(Project project, ProjectMetadata pmd, DataPackageMetadata metadata) {
// project metadata
JSONObject pkg = metadata.getPackage().getJson();
pmd.setName(getDataPackageProperty(pkg, Package.JSON_KEY_NAME));
pmd.setDescription(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_DESCRIPTION));
pmd.setTitle(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_TITLE));
pmd.setHomepage(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_HOMEPAGE));
pmd.setImage(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_IMAGE));
pmd.setLicense(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_LICENSE));
pmd.setVersion(getDataPackageProperty(pkg, PackageExtension.JSON_KEY_VERSION));
if (pkg.has(PackageExtension.JSON_KEY_KEYWORKS)) {
String[] tags = pkg.getJSONArray(PackageExtension.JSON_KEY_KEYWORKS).toList().toArray(new String[0]);
pmd.appendTags(tags);
}
// column model
JSONObject schema = metadata.getPackage().getResources().get(0).getSchema();
if (schema != null) {
populateColumnTypes(project.columnModel, schema.getJSONArray(Schema.JSON_KEY_FIELDS));
}
}
private static String getDataPackageProperty(JSONObject pkg, String key) {
return JSONUtilities.getString(pkg, key, StringUtils.EMPTY);
}
/**
* Populate the column types, formats, titles, descriptions and constraints of the column model from the table schema fields.
* @param columnModel
* @param fieldsJSON
*/
private static void populateColumnTypes(ColumnModel columnModel, JSONArray fieldsJSON) {
int cellIndex = 0;
Iterator<Object> iter = fieldsJSON.iterator();
while(iter.hasNext()){
JSONObject fieldJsonObj = (JSONObject)iter.next();
Field field = new Field(fieldJsonObj);
Column column = columnModel.getColumnByCellIndex(cellIndex);
column.setType(field.getType());
column.setFormat(field.getFormat());
column.setDescription(field.getDescription());
column.setTitle(field.getTitle());
column.setConstraints(field.getConstraints());
cellIndex++;
}
}
/**
* Create the project metadata, pulling "USER_NAME" from the PreferenceStore as the creator.
* @param optionObj
* @return the newly created ProjectMetadata
*/
static public ProjectMetadata createProjectMetadata(JSONObject optionObj) { static public ProjectMetadata createProjectMetadata(JSONObject optionObj) {
ProjectMetadata pm = new ProjectMetadata(); ProjectMetadata pm = new ProjectMetadata();
PreferenceStore ps = ProjectManager.singleton.getPreferenceStore();
pm.setName(JSONUtilities.getString(optionObj, "projectName", "Untitled")); pm.setName(JSONUtilities.getString(optionObj, "projectName", "Untitled"));
pm.setTags(JSONUtilities.getStringArray(optionObj, "projectTags")); pm.setTags(JSONUtilities.getStringArray(optionObj, "projectTags"));
pm.setTitle(JSONUtilities.getString(optionObj, "title", ""));
pm.setHomepage(JSONUtilities.getString(optionObj, "homepage", ""));
pm.setImage(JSONUtilities.getString(optionObj, "image", ""));
pm.setLicense(JSONUtilities.getString(optionObj, "license", ""));
String encoding = JSONUtilities.getString(optionObj, "encoding", "UTF-8"); String encoding = JSONUtilities.getString(optionObj, "encoding", "UTF-8");
if ("".equals(encoding)) { if ("".equals(encoding)) {
@ -1044,6 +1225,12 @@ public class ImportingUtilities {
encoding = "UTF-8"; encoding = "UTF-8";
} }
pm.setEncoding(encoding); pm.setEncoding(encoding);
if (ps.get(PreferenceStore.USER_NAME) != null) {
String creator = (String) ps.get(PreferenceStore.USER_NAME);
pm.setCreator(creator);
}
return pm; return pm;
} }
} }
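The mapping performed by populateDataPackageMetadata() and populateColumnTypes() above can be pictured with a minimal descriptor. A sketch follows: the key strings are assumed to be what Package.JSON_KEY_NAME and the PackageExtension.JSON_KEY_* constants resolve to (the standard Frictionless Data Package property names); the code only builds the JSON to show which parts feed the project metadata and which feed the column model.

import org.json.JSONArray;
import org.json.JSONObject;

public class DataPackageMappingSketch {
    public static void main(String[] args) {
        JSONObject field = new JSONObject()
                .put("name", "population")
                .put("type", "integer")                 // -> Column.setType()
                .put("format", "default")               // -> Column.setFormat()
                .put("title", "Population")             // -> Column.setTitle()
                .put("description", "City population"); // -> Column.setDescription()

        JSONObject schema = new JSONObject().put("fields", new JSONArray().put(field));
        JSONObject resource = new JSONObject().put("path", "data.csv").put("schema", schema);

        JSONObject pkg = new JSONObject()
                .put("name", "cities")                          // -> ProjectMetadata.setName()
                .put("title", "Cities")                         // -> setTitle()
                .put("description", "City sizes")               // -> setDescription()
                .put("homepage", "http://example.org/cities")   // -> setHomepage()
                .put("image", "http://example.org/cities.png")  // -> setImage()
                .put("license", "ODC-PDDL-1.0")                 // -> setLicense()
                .put("version", "1.0")                          // -> setVersion()
                .put("keywords", new JSONArray().put("demographics")) // -> appendTags()
                .put("resources", new JSONArray().put(resource));

        // populateDataPackageMetadata() copies the top-level properties into ProjectMetadata,
        // then hands resources[0].schema.fields to populateColumnTypes() for the column model.
        System.out.println(pkg.toString(2));
    }
}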

View File

@ -33,12 +33,45 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
package com.google.refine.importing; package com.google.refine.importing;
import java.io.IOException;
import java.net.MalformedURLException;
import java.util.List;
/**
* Given a URL, this interface rewrites it into one or more URLs.
* Each result is stored in a Result and can be used for downloading, parsing, etc.
* A typical use is parsing the data package JSON file.
* @see DataPackageUrlRewriter
*/
public interface UrlRewriter { public interface UrlRewriter {
static public class Result { static public class Result {
public String rewrittenUrl; public String rewrittenUrl;
public String format; public String format;
public boolean download; public boolean download;
public String metaDataFormat;
public Result(String rewrittenUrl, String format, boolean download) {
this.rewrittenUrl = rewrittenUrl;
this.format = format;
this.download = download;
}
public Result(String rewrittenUrl, String format, boolean download, String metaDataFormat) {
this.rewrittenUrl = rewrittenUrl;
this.format = format;
this.download = download;
this.metaDataFormat = metaDataFormat;
}
} }
public Result rewrite(String url); /**
* Parse the URL and output the rewrite results.
* @param url
* @return the list of rewrite results
* @throws MalformedURLException
* @throws IOException
*/
public List<Result> rewrite(String url) throws MalformedURLException, IOException;
public boolean filter(String url);
} }
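A sketch of how an implementation of the new List-returning contract might look for data package URLs. This is not the actual DataPackageUrlRewriter referenced above, only an illustration of emitting one Result for the descriptor (tagged with its metadata format) plus one per resource path; the "text/json" format id and the resource-path handling are assumptions.

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.MalformedURLException;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

import org.json.JSONArray;
import org.json.JSONObject;
import org.json.JSONTokener;

import com.google.refine.importing.UrlRewriter;
import com.google.refine.model.medadata.MetadataFormat;

public class ExampleDataPackageUrlRewriter implements UrlRewriter {

    @Override
    public boolean filter(String url) {
        // Only handle URLs pointing at a data package descriptor.
        return url.endsWith("datapackage.json");
    }

    @Override
    public List<Result> rewrite(String url) throws MalformedURLException, IOException {
        List<Result> results = new ArrayList<Result>();
        // The descriptor itself: download it and tag it as data package metadata.
        results.add(new Result(url, "text/json", true, MetadataFormat.DATAPACKAGE_METADATA.name()));

        URL base = new URL(url);
        JSONObject descriptor;
        try (InputStream in = base.openStream()) {
            descriptor = new JSONObject(new JSONTokener(new InputStreamReader(in, StandardCharsets.UTF_8)));
        }

        // One additional download per resource path, resolved against the descriptor URL.
        JSONArray resources = descriptor.optJSONArray("resources");
        if (resources != null) {
            for (int i = 0; i < resources.length(); i++) {
                String path = resources.getJSONObject(i).optString("path");
                if (!path.isEmpty()) {
                    // null format: let the importer's format guessing decide.
                    results.add(new Result(new URL(base, path).toString(), null, true));
                }
            }
        }
        return results;
    }
}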

View File

@ -57,9 +57,12 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.history.HistoryEntryManager; import com.google.refine.history.HistoryEntryManager;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.DataPackageMetadata;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.MetadataFormat;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.preference.TopList; import com.google.refine.preference.TopList;
@ -120,7 +123,6 @@ public class FileProjectManager extends ProjectManager {
if (metadata == null) { if (metadata == null) {
metadata = ProjectMetadataUtilities.recover(getProjectDir(projectID), projectID); metadata = ProjectMetadataUtilities.recover(getProjectDir(projectID), projectID);
} }
if (metadata != null) { if (metadata != null) {
_projectsMetadata.put(projectID, metadata); _projectsMetadata.put(projectID, metadata);
if (_projectsTags == null) { if (_projectsTags == null) {
@ -155,7 +157,7 @@ public class FileProjectManager extends ProjectManager {
untar(destDir, inputStream); untar(destDir, inputStream);
} }
} }
protected void untar(File destDir, InputStream inputStream) throws IOException { protected void untar(File destDir, InputStream inputStream) throws IOException {
TarInputStream tin = new TarInputStream(inputStream); TarInputStream tin = new TarInputStream(inputStream);
TarEntry tarEntry = null; TarEntry tarEntry = null;
@ -231,9 +233,19 @@ public class FileProjectManager extends ProjectManager {
} }
@Override @Override
public void saveMetadata(ProjectMetadata metadata, long projectId) throws Exception { public void saveMetadata(IMetadata metadata, long projectId) throws Exception {
File projectDir = getProjectDir(projectId); File projectDir = getProjectDir(projectId);
ProjectMetadataUtilities.save(metadata, projectDir);
if (metadata.getFormatName() == MetadataFormat.PROJECT_METADATA) {
Project project = ProjectManager.singleton.getProject(projectId);
((ProjectMetadata)metadata).setRowCount(project.rows.size());
ProjectMetadataUtilities.save(metadata, projectDir);
} else if (metadata.getFormatName() == MetadataFormat.DATAPACKAGE_METADATA) {
DataPackageMetadata dp = (DataPackageMetadata)metadata;
dp.writeToFile(new File(projectDir, DataPackageMetadata.DEFAULT_FILE_NAME));
}
logger.info("metadata saved in " + metadata.getFormatName());
} }
@Override @Override
@ -320,8 +332,6 @@ public class FileProjectManager extends ProjectManager {
return saveWasNeeded; return saveWasNeeded;
} }
@Override @Override
public void deleteProject(long projectID) { public void deleteProject(long projectID) {
synchronized (this) { synchronized (this) {
@ -363,8 +373,6 @@ public class FileProjectManager extends ProjectManager {
protected boolean loadFromFile(File file) { protected boolean loadFromFile(File file) {
logger.info("Loading workspace: {}", file.getAbsolutePath()); logger.info("Loading workspace: {}", file.getAbsolutePath());
_projectsMetadata.clear();
boolean found = false; boolean found = false;
if (file.exists() || file.canRead()) { if (file.exists() || file.canRead()) {
@ -464,4 +472,4 @@ public class FileProjectManager extends ProjectManager {
public HistoryEntryManager getHistoryEntryManager(){ public HistoryEntryManager getHistoryEntryManager(){
return new FileHistoryEntryManager(); return new FileHistoryEntryManager();
} }
} }
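With saveMetadata() now keyed on the metadata format, a data package descriptor can be persisted alongside the classic project metadata. A brief sketch of how that might be driven, assuming ProjectManager exposes the overridden saveMetadata() and that DataPackageMetadata.DEFAULT_FILE_NAME is the datapackage.json descriptor; the import directory and project id are placeholders.

import java.io.File;

import com.google.refine.ProjectManager;
import com.google.refine.model.medadata.DataPackageMetadata;

public class SaveMetadataSketch {
    public static void saveDataPackage(File importDir, long projectId) throws Exception {
        // Load the descriptor that came with the import...
        DataPackageMetadata dp = new DataPackageMetadata();
        dp.loadFromFile(new File(importDir, DataPackageMetadata.DEFAULT_FILE_NAME));

        // ...and let the project manager write it into the project folder.
        // The DATAPACKAGE_METADATA branch above serializes it as the descriptor file.
        ProjectManager.singleton.saveMetadata(dp, projectId);
    }
}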

View File

@ -35,7 +35,6 @@ package com.google.refine.io;
import java.io.File; import java.io.File;
import java.io.FileOutputStream; import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.IOException; import java.io.IOException;
import java.io.OutputStreamWriter; import java.io.OutputStreamWriter;
import java.io.Writer; import java.io.Writer;
@ -44,27 +43,25 @@ import java.time.LocalDateTime;
import java.time.ZoneId; import java.time.ZoneId;
import java.util.List; import java.util.List;
import org.apache.commons.lang.StringUtils; import org.apache.commons.lang3.StringUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONObject;
import org.json.JSONTokener;
import org.json.JSONWriter; import org.json.JSONWriter;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.ProjectMetadata;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.ProjectMetadata;
public class ProjectMetadataUtilities { public class ProjectMetadataUtilities {
final static Logger logger = LoggerFactory.getLogger("project_metadata_utilities"); final static Logger logger = LoggerFactory.getLogger("project_metadata_utilities");
public static void save(ProjectMetadata projectMeta, File projectDir) throws JSONException, IOException { public static void save(IMetadata projectMeta, File projectDir) throws JSONException, IOException {
File tempFile = new File(projectDir, "metadata.temp.json"); File tempFile = new File(projectDir, ProjectMetadata.TEMP_FILE_NAME);
saveToFile(projectMeta, tempFile); saveToFile(projectMeta, tempFile);
File file = new File(projectDir, "metadata.json"); File file = new File(projectDir, ProjectMetadata.DEFAULT_FILE_NAME);
File oldFile = new File(projectDir, "metadata.old.json"); File oldFile = new File(projectDir, ProjectMetadata.OLD_FILE_NAME);
if (oldFile.exists()) { if (oldFile.exists()) {
oldFile.delete(); oldFile.delete();
@ -76,12 +73,16 @@ public class ProjectMetadataUtilities {
tempFile.renameTo(file); tempFile.renameTo(file);
} }
public static void saveTableSchema(Project project, File projectDir) throws JSONException, IOException {
protected static void saveToFile(ProjectMetadata projectMeta, File metadataFile) throws JSONException, IOException { }
protected static void saveToFile(IMetadata projectMeta, File metadataFile) throws JSONException, IOException {
Writer writer = new OutputStreamWriter(new FileOutputStream(metadataFile)); Writer writer = new OutputStreamWriter(new FileOutputStream(metadataFile));
try { try {
JSONWriter jsonWriter = new JSONWriter(writer); JSONWriter jsonWriter = new JSONWriter(writer);
projectMeta.write(jsonWriter); projectMeta.write(jsonWriter, false);
} finally { } finally {
writer.close(); writer.close();
} }
@ -89,17 +90,17 @@ public class ProjectMetadataUtilities {
static public ProjectMetadata load(File projectDir) { static public ProjectMetadata load(File projectDir) {
try { try {
return loadFromFile(new File(projectDir, "metadata.json")); return loadFromFile(new File(projectDir, ProjectMetadata.DEFAULT_FILE_NAME));
} catch (Exception e) { } catch (Exception e) {
} }
try { try {
return loadFromFile(new File(projectDir, "metadata.temp.json")); return loadFromFile(new File(projectDir, ProjectMetadata.TEMP_FILE_NAME));
} catch (Exception e) { } catch (Exception e) {
} }
try { try {
return loadFromFile(new File(projectDir, "metadata.old.json")); return loadFromFile(new File(projectDir, ProjectMetadata.OLD_FILE_NAME));
} catch (Exception e) { } catch (Exception e) {
} }
@ -148,14 +149,8 @@ public class ProjectMetadataUtilities {
} }
static protected ProjectMetadata loadFromFile(File metadataFile) throws Exception { static protected ProjectMetadata loadFromFile(File metadataFile) throws Exception {
FileReader reader = new FileReader(metadataFile); ProjectMetadata projectMetaData = new ProjectMetadata();
try { projectMetaData.loadFromFile(metadataFile);
JSONTokener tokener = new JSONTokener(reader); return projectMetaData;
JSONObject obj = (JSONObject) tokener.nextValue();
return ProjectMetadata.loadFromJSON(obj);
} finally {
reader.close();
}
} }
} }
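A short sketch of the save/load round trip with the refactored utilities, assuming ProjectMetadata's file-name constants resolve to the familiar metadata.json / metadata.temp.json / metadata.old.json rotation seen in the removed literals; the directory path is a placeholder.

import java.io.File;

import com.google.refine.io.ProjectMetadataUtilities;
import com.google.refine.model.medadata.ProjectMetadata;

public class ProjectMetadataRoundTripSketch {
    public static void main(String[] args) throws Exception {
        File projectDir = new File("/tmp/refine-project"); // placeholder path
        projectDir.mkdirs();

        ProjectMetadata pm = new ProjectMetadata();
        pm.setName("cities");

        // Writes ProjectMetadata.DEFAULT_FILE_NAME via the temp/old rotation above.
        ProjectMetadataUtilities.save(pm, projectDir);

        // loadFromFile() now delegates to ProjectMetadata.loadFromFile(File) instead of raw JSON parsing.
        ProjectMetadata loaded = ProjectMetadataUtilities.load(projectDir);
        System.out.println(loaded.getName());
    }
}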

View File

@ -36,6 +36,8 @@ package com.google.refine.io;
import java.io.File; import java.io.File;
import java.io.FileOutputStream; import java.io.FileOutputStream;
import java.io.IOException; import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.zip.ZipEntry; import java.util.zip.ZipEntry;
import java.util.zip.ZipFile; import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream; import java.util.zip.ZipOutputStream;
@ -45,6 +47,9 @@ import org.slf4j.LoggerFactory;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.model.Project; import com.google.refine.model.Project;
import com.google.refine.model.medadata.DataPackageMetadata;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.MetadataFormat;
import com.google.refine.util.Pool; import com.google.refine.util.Pool;
@ -110,38 +115,63 @@ public class ProjectUtilities {
out.close(); out.close();
} }
} }
static public Project load(File dir, long id) { static public Project loadDataFile(File dir, String dataFile, long id) {
try { try {
File file = new File(dir, "data.zip"); File file = new File(dir, dataFile);
if (file.exists()) { if (file.exists()) {
return loadFromFile(file, id); return loadFromFile(file, id);
} }
} catch (Exception e) { } catch (Exception e) {
e.printStackTrace(); e.printStackTrace();
} }
try {
File file = new File(dir, "data.temp.zip");
if (file.exists()) {
return loadFromFile(file, id);
}
} catch (Exception e) {
e.printStackTrace();
}
try {
File file = new File(dir, "data.old.zip");
if (file.exists()) {
return loadFromFile(file, id);
}
} catch (Exception e) {
e.printStackTrace();
}
return null; return null;
} }
static public Project load(File dir, long id) {
Project project = null;
if ((project = loadDataFile(dir, "data.zip", id)) == null) {
if ((project = loadDataFile(dir, "data.temp.zip", id)) == null) {
project = loadDataFile(dir, "data.old.zip", id);
}
}
return project;
}
/**
* Scan the folder for JSON files and read them as metadata.
* @param dir the project data folder
* @return a map from metadata format to the loaded metadata
*/
public static Map<MetadataFormat, IMetadata> retriveMetadata(File dir) {
// load the metadata files from the data folder.
Map<MetadataFormat, IMetadata> metadataMap = new HashMap<MetadataFormat, IMetadata>();
File[] jsons = dir.listFiles(
(folder, file) -> {
return file.toLowerCase().endsWith(".json");
}
);
for (File file : jsons) {
// skip metadata.* files, which are already loaded as project metadata
if (file.getName().startsWith("metadata."))
continue;
DataPackageMetadata metadata = new DataPackageMetadata();
// load the data package descriptor from the JSON file
metadata.loadFromFile(file);
metadataMap.put(MetadataFormat.DATAPACKAGE_METADATA, metadata);
}
return metadataMap;
}
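A small sketch of how the scan might be consumed when re-opening a project folder; the directory path is a placeholder.

import java.io.File;
import java.util.Map;

import com.google.refine.io.ProjectUtilities;
import com.google.refine.model.medadata.IMetadata;
import com.google.refine.model.medadata.MetadataFormat;

public class RetrieveMetadataSketch {
    public static void main(String[] args) {
        File projectDir = new File("/tmp/refine-project"); // placeholder
        Map<MetadataFormat, IMetadata> found = ProjectUtilities.retriveMetadata(projectDir);
        for (Map.Entry<MetadataFormat, IMetadata> e : found.entrySet()) {
            System.out.println(e.getKey() + " -> " + e.getValue().getFormatName());
        }
    }
}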
static protected Project loadFromFile( static protected Project loadFromFile(
File file, File file,
long id long id

View File

@ -34,10 +34,12 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
package com.google.refine.model; package com.google.refine.model;
import java.io.Writer; import java.io.Writer;
import java.lang.reflect.Method;
import java.util.HashMap; import java.util.HashMap;
import java.util.Map; import java.util.Map;
import java.util.Properties; import java.util.Properties;
import org.apache.commons.lang3.StringUtils;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONObject; import org.json.JSONObject;
import org.json.JSONWriter; import org.json.JSONWriter;
@ -45,8 +47,14 @@ import org.json.JSONWriter;
import com.google.refine.InterProjectModel; import com.google.refine.InterProjectModel;
import com.google.refine.Jsonizable; import com.google.refine.Jsonizable;
import com.google.refine.model.recon.ReconConfig; import com.google.refine.model.recon.ReconConfig;
import com.google.refine.util.JSONUtilities;
import com.google.refine.util.ParsingUtilities; import com.google.refine.util.ParsingUtilities;
import io.frictionlessdata.tableschema.Field;
import io.frictionlessdata.tableschema.TypeInferrer;
import io.frictionlessdata.tableschema.exceptions.ConstraintsException;
import io.frictionlessdata.tableschema.exceptions.InvalidCastException;
public class Column implements Jsonizable { public class Column implements Jsonizable {
final private int _cellIndex; final private int _cellIndex;
final private String _originalName; final private String _originalName;
@ -54,6 +62,13 @@ public class Column implements Jsonizable {
private ReconConfig _reconConfig; private ReconConfig _reconConfig;
private ReconStats _reconStats; private ReconStats _reconStats;
// from data package metadata Field.java:
private String type = "";
private String format = Field.FIELD_FORMAT_DEFAULT;
private String title = "";
private String description = "";
private Map<String, Object> constraints = null;
transient protected Map<String, Object> _precomputes; transient protected Map<String, Object> _precomputes;
public Column(int cellIndex, String originalName) { public Column(int cellIndex, String originalName) {
@ -101,6 +116,11 @@ public class Column implements Jsonizable {
writer.key("cellIndex"); writer.value(_cellIndex); writer.key("cellIndex"); writer.value(_cellIndex);
writer.key("originalName"); writer.value(_originalName); writer.key("originalName"); writer.value(_originalName);
writer.key("name"); writer.value(_name); writer.key("name"); writer.value(_name);
writer.key("type"); writer.value(type);
writer.key("format"); writer.value(format);
writer.key("title"); writer.value(title);
writer.key("description"); writer.value(description);
writer.key("constraints"); writer.value(new JSONObject(constraints).toString());
if (_reconConfig != null) { if (_reconConfig != null) {
writer.key("reconConfig"); writer.key("reconConfig");
_reconConfig.write(writer, options); _reconConfig.write(writer, options);
@ -140,6 +160,56 @@ public class Column implements Jsonizable {
_precomputes.put(key, value); _precomputes.put(key, value);
} }
public String getType() {
return type;
}
public void setType(String type) {
this.type = type;
}
public String getFormat() {
return format;
}
public void setFormat(String format) {
this.format = format;
}
public String getTitle() {
return title;
}
public void setTitle(String title) {
this.title = title;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public Map<String, Object> getConstraints() {
return constraints;
}
public void setConstraints(Map<String, Object> constraints) {
this.constraints = constraints;
}
public void save(Writer writer) { public void save(Writer writer) {
JSONWriter jsonWriter = new JSONWriter(writer); JSONWriter jsonWriter = new JSONWriter(writer);
try { try {
@ -154,6 +224,14 @@ public class Column implements Jsonizable {
Column column = new Column(obj.getInt("cellIndex"), obj.getString("originalName")); Column column = new Column(obj.getInt("cellIndex"), obj.getString("originalName"));
column._name = obj.getString("name"); column._name = obj.getString("name");
column.type = JSONUtilities.getString(obj, Field.JSON_KEY_TYPE, StringUtils.EMPTY);
column.format = JSONUtilities.getString(obj, Field.JSON_KEY_FORMAT, StringUtils.EMPTY);
column.title = JSONUtilities.getString(obj, Field.JSON_KEY_TITLE, StringUtils.EMPTY);
column.description = JSONUtilities.getString(obj, Field.JSON_KEY_DESCRIPTION, StringUtils.EMPTY);
if (obj.has(Field.JSON_KEY_CONSTRAINTS)) {
column.constraints = new JSONObject(obj.getString(Field.JSON_KEY_CONSTRAINTS)).toMap();
}
if (obj.has("reconConfig")) { if (obj.has("reconConfig")) {
column._reconConfig = ReconConfig.reconstruct(obj.getJSONObject("reconConfig")); column._reconConfig = ReconConfig.reconstruct(obj.getJSONObject("reconConfig"));
} }
@ -168,4 +246,23 @@ public class Column implements Jsonizable {
public String toString() { public String toString() {
return _name; return _name;
} }
public <Any> Any castValue(String value)
throws InvalidCastException, ConstraintsException {
if (this.type.isEmpty()) {
throw new InvalidCastException();
} else {
try {
// Using reflection to invoke appropriate type casting method from the
// TypeInferrer class
String castMethodName = "cast" + (this.type.substring(0, 1).toUpperCase() + this.type.substring(1));
Method method = TypeInferrer.class.getMethod(castMethodName, String.class, String.class, Map.class);
Object castValue = method.invoke(TypeInferrer.getInstance(), this.format, value, null);
return (Any) castValue;
} catch (Exception e) {
throw new InvalidCastException();
}
}
}
} }
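A usage sketch for the new schema-aware fields on Column. It assumes the reflective lookup in castValue() finds a matching castInteger(format, value, options) method on the tableschema TypeInferrer, which is what the method's "cast" + capitalized-type naming convention expects; "integer" is the table schema type name.

import com.google.refine.model.Column;

import io.frictionlessdata.tableschema.exceptions.ConstraintsException;
import io.frictionlessdata.tableschema.exceptions.InvalidCastException;

public class ColumnCastSketch {
    public static void main(String[] args) {
        Column column = new Column(0, "population");
        column.setType("integer"); // format stays at Field.FIELD_FORMAT_DEFAULT via the field initializer

        try {
            // Resolves to TypeInferrer.castInteger(format, value, options) via reflection.
            Object value = column.castValue("12345");
            System.out.println(value);
        } catch (InvalidCastException | ConstraintsException e) {
            // Thrown when the type is empty or the reflective cast fails.
            e.printStackTrace();
        }
    }
}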

View File

@ -55,9 +55,9 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.ProjectManager; import com.google.refine.ProjectManager;
import com.google.refine.ProjectMetadata;
import com.google.refine.RefineServlet; import com.google.refine.RefineServlet;
import com.google.refine.history.History; import com.google.refine.history.History;
import com.google.refine.model.medadata.ProjectMetadata;
import com.google.refine.process.ProcessManager; import com.google.refine.process.ProcessManager;
import com.google.refine.util.ParsingUtilities; import com.google.refine.util.ParsingUtilities;
import com.google.refine.util.Pool; import com.google.refine.util.Pool;
@ -77,14 +77,13 @@ public class Project {
transient private LocalDateTime _lastSave = LocalDateTime.now(); transient private LocalDateTime _lastSave = LocalDateTime.now();
final static Logger logger = LoggerFactory.getLogger("project"); final static Logger logger = LoggerFactory.getLogger("project");
static public long generateID() { static public long generateID() {
return System.currentTimeMillis() + Math.round(Math.random() * 1000000000000L); return System.currentTimeMillis() + Math.round(Math.random() * 1000000000000L);
} }
public Project() { public Project() {
id = generateID(); this(generateID());
history = new History(this);
} }
protected Project(long id) { protected Project(long id) {
@ -121,10 +120,6 @@ public class Project {
this._lastSave = LocalDateTime.now(); this._lastSave = LocalDateTime.now();
} }
public ProjectMetadata getMetadata() {
return ProjectManager.singleton.getProjectMetadata(id);
}
public void saveToOutputStream(OutputStream out, Pool pool) throws IOException { public void saveToOutputStream(OutputStream out, Pool pool) throws IOException {
for (OverlayModel overlayModel : overlayModels.values()) { for (OverlayModel overlayModel : overlayModels.values()) {
try { try {
@ -258,11 +253,14 @@ public class Project {
columnModel.update(); columnModel.update();
recordModel.update(this); recordModel.update(this);
} }
//wrapper of processManager variable to allow unit testing //wrapper of processManager variable to allow unit testing
//TODO make the processManager variable private, and force all calls through this method //TODO make the processManager variable private, and force all calls through this method
public ProcessManager getProcessManager() { public ProcessManager getProcessManager() {
return this.processManager; return this.processManager;
} }
public ProjectMetadata getMetadata() {
return ProjectManager.singleton.getProjectMetadata(id);
}
} }

View File

@ -1,303 +1,303 @@
/* /*
Copyright 2010, Google Inc. Copyright 2010, Google Inc.
All rights reserved. All rights reserved.
Redistribution and use in source and binary forms, with or without Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are modification, are permitted provided that the following conditions are
met: met:
* Redistributions of source code must retain the above copyright * Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer. notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above * Redistributions in binary form must reproduce the above
copyright notice, this list of conditions and the following disclaimer copyright notice, this list of conditions and the following disclaimer
in the documentation and/or other materials provided with the in the documentation and/or other materials provided with the
distribution. distribution.
* Neither the name of Google Inc. nor the names of its * Neither the name of Google Inc. nor the names of its
contributors may be used to endorse or promote products derived from contributors may be used to endorse or promote products derived from
this software without specific prior written permission. this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/ */
package com.google.refine.model; package com.google.refine.model;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.Arrays; import java.util.Arrays;
import java.util.Collections; import java.util.Collections;
import java.util.Comparator; import java.util.Comparator;
import java.util.List; import java.util.List;
import java.util.Properties; import java.util.Properties;
import org.json.JSONException; import org.json.JSONException;
import org.json.JSONWriter; import org.json.JSONWriter;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import com.google.refine.Jsonizable; import com.google.refine.Jsonizable;
import com.google.refine.expr.ExpressionUtils; import com.google.refine.expr.ExpressionUtils;
public class RecordModel implements Jsonizable { public class RecordModel implements Jsonizable {
final static Logger logger = LoggerFactory.getLogger("RecordModel"); final static Logger logger = LoggerFactory.getLogger("RecordModel");
final static public class CellDependency { final static public class CellDependency {
final public int rowIndex; final public int rowIndex;
final public int cellIndex; final public int cellIndex;
public CellDependency(int rowIndex, int cellIndex) { public CellDependency(int rowIndex, int cellIndex) {
this.rowIndex = rowIndex; this.rowIndex = rowIndex;
this.cellIndex = cellIndex; this.cellIndex = cellIndex;
} }
@Override @Override
public String toString() { public String toString() {
return rowIndex+","+cellIndex; return rowIndex+","+cellIndex;
} }
} }
final static public class RowDependency { final static public class RowDependency {
public int recordIndex; public int recordIndex;
public CellDependency[] cellDependencies; public CellDependency[] cellDependencies;
public List<Integer> contextRows; public List<Integer> contextRows;
@Override @Override
public String toString() { public String toString() {
return "Idx: "+recordIndex+" CellDeps: "+Arrays.toString(cellDependencies)+" Rows:"+contextRows; return "Idx: "+recordIndex+" CellDeps: "+Arrays.toString(cellDependencies)+" Rows:"+contextRows;
} }
} }
protected List<RowDependency> _rowDependencies; protected List<RowDependency> _rowDependencies;
protected List<Record> _records; protected List<Record> _records;
public RowDependency getRowDependency(int rowIndex) { public RowDependency getRowDependency(int rowIndex) {
return _rowDependencies != null && rowIndex >= 0 && rowIndex < _rowDependencies.size() ? return _rowDependencies != null && rowIndex >= 0 && rowIndex < _rowDependencies.size() ?
_rowDependencies.get(rowIndex) : null; _rowDependencies.get(rowIndex) : null;
} }
public int getRecordCount() { public int getRecordCount() {
return _records.size(); return _records.size();
} }
public Record getRecord(int recordIndex) { public Record getRecord(int recordIndex) {
return _records != null && recordIndex >= 0 && recordIndex < _records.size() ? return _records != null && recordIndex >= 0 && recordIndex < _records.size() ?
_records.get(recordIndex) : null; _records.get(recordIndex) : null;
} }
public Record getRecordOfRow(int rowIndex) { public Record getRecordOfRow(int rowIndex) {
RowDependency rd = getRowDependency(rowIndex); RowDependency rd = getRowDependency(rowIndex);
if (rd != null) { if (rd != null) {
if (rd.recordIndex < 0) { if (rd.recordIndex < 0) {
rd = getRowDependency(rd.contextRows.get(0)); rd = getRowDependency(rd.contextRows.get(0));
} }
return getRecord(rd.recordIndex); return getRecord(rd.recordIndex);
} }
return null; return null;
} }
@Override @Override
synchronized public void write(JSONWriter writer, Properties options) synchronized public void write(JSONWriter writer, Properties options)
throws JSONException { throws JSONException {
writer.object(); writer.object();
writer.key("hasRecords"); writer.key("hasRecords");
writer.value( writer.value(
_records != null && _rowDependencies != null && _records != null && _rowDependencies != null &&
_records.size() < _rowDependencies.size()); _records.size() < _rowDependencies.size());
writer.endObject(); writer.endObject();
} }
static protected class KeyedGroup { static protected class KeyedGroup {
int[] cellIndices; int[] cellIndices;
int keyCellIndex; int keyCellIndex;
@Override @Override
public String toString() { public String toString() {
StringBuffer sb = new StringBuffer(); StringBuffer sb = new StringBuffer();
for (int i:cellIndices) { for (int i:cellIndices) {
sb.append(i).append(','); sb.append(i).append(',');
} }
return "key: " + keyCellIndex + " cells: " + sb.toString(); return "key: " + keyCellIndex + " cells: " + sb.toString();
} }
} }
synchronized public void update(Project project) { synchronized public void update(Project project) {
synchronized (project) { synchronized (project) {
List<Row> rows = project.rows; List<Row> rows = project.rows;
int rowCount = rows.size(); int rowCount = rows.size();
ColumnModel columnModel = project.columnModel; ColumnModel columnModel = project.columnModel;
List<KeyedGroup> keyedGroups = computeKeyedGroups(columnModel); List<KeyedGroup> keyedGroups = computeKeyedGroups(columnModel);
int groupCount = keyedGroups.size(); int groupCount = keyedGroups.size();
int[] lastNonBlankRowsByGroup = new int[keyedGroups.size()]; int[] lastNonBlankRowsByGroup = new int[keyedGroups.size()];
for (int i = 0; i < lastNonBlankRowsByGroup.length; i++) { for (int i = 0; i < lastNonBlankRowsByGroup.length; i++) {
lastNonBlankRowsByGroup[i] = -1; lastNonBlankRowsByGroup[i] = -1;
} }
_rowDependencies = new ArrayList<RowDependency>(rowCount); _rowDependencies = new ArrayList<RowDependency>(rowCount);
int recordIndex = 0; int recordIndex = 0;
for (int r = 0; r < rowCount; r++) { for (int r = 0; r < rowCount; r++) {
Row row = rows.get(r); Row row = rows.get(r);
RowDependency rowDependency = new RowDependency(); RowDependency rowDependency = new RowDependency();
for (int g = 0; g < groupCount; g++) { for (int g = 0; g < groupCount; g++) {
KeyedGroup group = keyedGroups.get(g); KeyedGroup group = keyedGroups.get(g);
if (!ExpressionUtils.isNonBlankData(row.getCellValue(keyedGroups.get(0).keyCellIndex)) && if (!ExpressionUtils.isNonBlankData(row.getCellValue(keyedGroups.get(0).keyCellIndex)) &&
!ExpressionUtils.isNonBlankData(row.getCellValue(group.keyCellIndex))) { !ExpressionUtils.isNonBlankData(row.getCellValue(group.keyCellIndex))) {
int contextRowIndex = lastNonBlankRowsByGroup[g]; int contextRowIndex = lastNonBlankRowsByGroup[g];
if (contextRowIndex >= 0) { if (contextRowIndex >= 0) {
for (int dependentCellIndex : group.cellIndices) { for (int dependentCellIndex : group.cellIndices) {
if (ExpressionUtils.isNonBlankData(row.getCellValue(dependentCellIndex))) { if (ExpressionUtils.isNonBlankData(row.getCellValue(dependentCellIndex))) {
setRowDependency( setRowDependency(
project, project,
rowDependency, rowDependency,
dependentCellIndex, dependentCellIndex,
contextRowIndex, contextRowIndex,
group.keyCellIndex group.keyCellIndex
); );
} }
} }
} }
} else { } else {
lastNonBlankRowsByGroup[g] = r; lastNonBlankRowsByGroup[g] = r;
} }
} }
if (rowDependency.cellDependencies != null && rowDependency.cellDependencies.length > 0) { if (rowDependency.cellDependencies != null && rowDependency.cellDependencies.length > 0) {
rowDependency.recordIndex = -1; rowDependency.recordIndex = -1;
rowDependency.contextRows = new ArrayList<Integer>(); rowDependency.contextRows = new ArrayList<Integer>();
for (CellDependency cd : rowDependency.cellDependencies) { for (CellDependency cd : rowDependency.cellDependencies) {
if (cd != null) { if (cd != null) {
rowDependency.contextRows.add(cd.rowIndex); rowDependency.contextRows.add(cd.rowIndex);
} }
} }
Collections.sort(rowDependency.contextRows); Collections.sort(rowDependency.contextRows);
} else { } else {
rowDependency.recordIndex = recordIndex++; rowDependency.recordIndex = recordIndex++;
} }
_rowDependencies.add(rowDependency); _rowDependencies.add(rowDependency);
} }
_records = new ArrayList<Record>(recordIndex); _records = new ArrayList<Record>(recordIndex);
if (recordIndex > 0) { if (recordIndex > 0) {
recordIndex = 0; recordIndex = 0;
int recordRowIndex = 0; int recordRowIndex = 0;
for (int r = 1; r < rowCount; r++) { for (int r = 1; r < rowCount; r++) {
RowDependency rd = _rowDependencies.get(r); RowDependency rd = _rowDependencies.get(r);
if (rd.recordIndex >= 0) { if (rd.recordIndex >= 0) {
_records.add(new Record(recordRowIndex, r, recordIndex++)); _records.add(new Record(recordRowIndex, r, recordIndex++));
recordIndex = rd.recordIndex; recordIndex = rd.recordIndex;
recordRowIndex = r; recordRowIndex = r;
} }
} }
_records.add(new Record(recordRowIndex, rowCount, recordIndex++)); _records.add(new Record(recordRowIndex, rowCount, recordIndex++));
} }
} }
} }
protected List<KeyedGroup> computeKeyedGroups(ColumnModel columnModel) { protected List<KeyedGroup> computeKeyedGroups(ColumnModel columnModel) {
List<KeyedGroup> keyedGroups = new ArrayList<KeyedGroup>(); List<KeyedGroup> keyedGroups = new ArrayList<KeyedGroup>();
addRootKeyedGroup(columnModel, keyedGroups); addRootKeyedGroup(columnModel, keyedGroups);
for (ColumnGroup group : columnModel.columnGroups) { for (ColumnGroup group : columnModel.columnGroups) {
if (group.keyColumnIndex >= 0) { if (group.keyColumnIndex >= 0) {
KeyedGroup keyedGroup = new KeyedGroup(); KeyedGroup keyedGroup = new KeyedGroup();
keyedGroup.keyCellIndex = columnModel.columns.get(group.keyColumnIndex).getCellIndex(); keyedGroup.keyCellIndex = columnModel.columns.get(group.keyColumnIndex).getCellIndex();
                keyedGroup.cellIndices = new int[group.columnSpan - 1];

                int c = 0;
                for (int i = 0; i < group.columnSpan; i++) {
                    int columnIndex = group.startColumnIndex + i;
                    if (columnIndex != group.keyColumnIndex && columnIndex < columnModel.columns.size()) {
                        int cellIndex = columnModel.columns.get(columnIndex).getCellIndex();
                        keyedGroup.cellIndices[c++] = cellIndex;
                    }
                }

                keyedGroups.add(keyedGroup);
            }
        }

        Collections.sort(keyedGroups, new Comparator<KeyedGroup>() {
            @Override
            public int compare(KeyedGroup o1, KeyedGroup o2) {
                return o2.cellIndices.length - o1.cellIndices.length; // larger groups first
            }
        });

        dumpKeyedGroups(keyedGroups, columnModel); // for debug

        return keyedGroups;
    }

    // debugging helper
    private void dumpKeyedGroups(List<KeyedGroup> groups, ColumnModel columnModel) {
        for (KeyedGroup g : groups) {
            String keyColName = columnModel.getColumnByCellIndex(g.keyCellIndex).getName();
            StringBuffer sb = new StringBuffer();
            for (int ci : g.cellIndices) {
                Column col = columnModel.getColumnByCellIndex(ci);
                if (col != null) {
                    // Old projects have col 0 slot empty
                    sb.append(col.getName()).append(',');
                }
            }
            logger.trace("KeyedGroup " + keyColName + "::" + sb.toString());
        }
    }

    protected void addRootKeyedGroup(ColumnModel columnModel, List<KeyedGroup> keyedGroups) {
        int count = columnModel.getMaxCellIndex() + 1;
        if (count > 0 && columnModel.getKeyColumnIndex() < columnModel.columns.size()) {
            KeyedGroup rootKeyedGroup = new KeyedGroup();

            rootKeyedGroup.cellIndices = new int[count - 1];
            rootKeyedGroup.keyCellIndex = columnModel.columns.get(columnModel.getKeyColumnIndex()).getCellIndex();

            for (int i = 0; i < count; i++) {
                if (i < rootKeyedGroup.keyCellIndex) {
                    rootKeyedGroup.cellIndices[i] = i;
                } else if (i > rootKeyedGroup.keyCellIndex) {
                    rootKeyedGroup.cellIndices[i - 1] = i;
                }
            }

            keyedGroups.add(rootKeyedGroup);
        }
    }

    protected void setRowDependency(
        Project project,
        RowDependency rowDependency,
        int cellIndex,
        int contextRowIndex,
        int contextCellIndex
    ) {
        if (rowDependency.cellDependencies == null) {
            int count = project.columnModel.getMaxCellIndex() + 1;
            rowDependency.cellDependencies = new CellDependency[count];
        }

        rowDependency.cellDependencies[cellIndex] =
            new CellDependency(contextRowIndex, contextCellIndex);
    }
}


@@ -62,6 +62,21 @@ public class ColumnAdditionChange extends ColumnChange {
        newCells.toArray(_newCells);
    }

    public String getColumnName() {
        return _columnName;
    }

    public int getColumnIndex() {
        return _columnIndex;
    }

    public int getNewCellIndex() {
        return _newCellIndex;
    }

    @Override
    public void apply(Project project) {
        synchronized (project) {


@@ -58,6 +58,18 @@ public class ColumnMoveChange extends ColumnChange {
        _newColumnIndex = index;
    }

    public int getOldColumnIndex() {
        return _oldColumnIndex;
    }

    public String getColumnName() {
        return _columnName;
    }

    public int getNewColumnIndex() {
        return _newColumnIndex;
    }

    @Override
    public void apply(Project project) {
        synchronized (project) {


@@ -54,11 +54,15 @@ public class ColumnRemovalChange extends ColumnChange {
    protected Column _oldColumn;
    protected CellAtRow[] _oldCells;
    protected List<ColumnGroup> _oldColumnGroups;

    public ColumnRemovalChange(int index) {
        _oldColumnIndex = index;
    }

    public int getOldColumnIndex() {
        return _oldColumnIndex;
    }

    @Override
    public void apply(Project project) {
        synchronized (project) {


@@ -57,6 +57,11 @@ public class ColumnReorderChange extends ColumnChange {
        _columnNames = columnNames;
    }

    public List<String> getColumnNames() {
        return _columnNames;
    }

    @Override
    public void apply(Project project) {
        synchronized (project) {


@@ -54,7 +54,7 @@ import com.google.refine.model.Project;
import com.google.refine.model.Row;
import com.google.refine.util.Pool;

-public class ColumnSplitChange implements Change {
+public class ColumnSplitChange extends ColumnChange {
    final protected String _columnName;
    final protected List<String> _columnNames;

@@ -118,6 +118,21 @@ public class ColumnSplitChange implements Change {
        _newRows = newRows;
    }

    public List<String> getColumnNames() {
        return _columnNames;
    }

    public boolean isRemoveOriginalColumn() {
        return _removeOriginalColumn;
    }

    public int getColumnIndex() {
        return _columnIndex;
    }

    @Override
    public void apply(Project project) {
        synchronized (project) {


@@ -0,0 +1,73 @@
package com.google.refine.model.medadata;

import java.io.File;
import java.time.LocalDateTime;
import java.util.Properties;

import org.apache.commons.beanutils.PropertyUtils;
import org.json.JSONException;
import org.json.JSONObject;
import org.json.JSONWriter;

public abstract class AbstractMetadata implements IMetadata {
    private MetadataFormat formatName = MetadataFormat.UNKNOWN;

    protected LocalDateTime written = null;
    protected LocalDateTime _modified;

    public MetadataFormat getFormatName() {
        return formatName;
    }

    public void setFormatName(MetadataFormat formatName) {
        this.formatName = formatName;
    }

    @Override
    public abstract void loadFromJSON(JSONObject obj);

    @Override
    public abstract void loadFromFile(File metadataFile);

    @Override
    public abstract void writeToFile(File metadataFile);

    @Override
    public boolean isDirty() {
        return written == null || _modified.isAfter(written);
    }

    @Override
    public LocalDateTime getModified() {
        return _modified;
    }

    @Override
    public void updateModified() {
        _modified = LocalDateTime.now();
    }

    /**
     * @param jsonWriter
     *            writer to save metadata to
     * @param onlyIfDirty
     *            true to not write unchanged metadata
     * @throws JSONException
     */
    @Override
    public void write(JSONWriter jsonWriter, boolean onlyIfDirty) throws JSONException {
        if (!onlyIfDirty || isDirty()) {
            Properties options = new Properties();
            options.setProperty("mode", "save");

            write(jsonWriter, options);
        }
    }

    protected static boolean propertyExists(Object bean, String property) {
        return PropertyUtils.isReadable(bean, property) &&
                PropertyUtils.isWriteable(bean, property);
    }
}


@@ -0,0 +1,130 @@
package com.google.refine.model.medadata;

import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;

import org.apache.commons.io.FileUtils;
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.everit.json.schema.ValidationException;
import org.json.JSONException;
import org.json.JSONObject;
import org.json.JSONWriter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import io.frictionlessdata.datapackage.Package;
import io.frictionlessdata.datapackage.Resource;
import io.frictionlessdata.datapackage.exceptions.DataPackageException;

public class DataPackageMetadata extends AbstractMetadata {
    private final static Logger logger = LoggerFactory.getLogger(DataPackageMetadata.class);

    public static final String DEFAULT_FILE_NAME = "datapackage.json";

    private Package _pkg;

    public DataPackageMetadata() {
        setFormatName(MetadataFormat.DATAPACKAGE_METADATA);
        _pkg = PackageExtension.buildPackageFromTemplate();
    }

    @Override
    public void loadFromJSON(JSONObject obj) {
        try {
            _pkg = new Package(obj);
        } catch (ValidationException | DataPackageException | IOException e) {
            logger.error("Load from JSONObject failed" + obj.toString(4),
                    ExceptionUtils.getStackTrace(e));
        }
        logger.info("Data Package metadata loaded");
    }

    @Override
    public void loadFromFile(File metadataFile) {
        String jsonString = null;
        try {
            jsonString = FileUtils.readFileToString(metadataFile);
        } catch (IOException e) {
            logger.error("Load data package failed when reading from file: " + metadataFile.getAbsolutePath(),
                    ExceptionUtils.getStackTrace(e));
        }

        loadFromJSON(new JSONObject(jsonString));
    }

    /**
     * Write the package to a json file.
     */
    @Override
    public void writeToFile(File metadataFile) {
        try {
            this._pkg.save(metadataFile.getAbsolutePath());
        } catch (IOException e) {
            logger.error("IO exception when writing to file " + metadataFile.getAbsolutePath(),
                    ExceptionUtils.getStackTrace(e));
        } catch (DataPackageException e) {
            logger.error("Data package exception when writing to file " + metadataFile.getAbsolutePath(),
                    ExceptionUtils.getStackTrace(e));
        }
    }

    @Override
    public void write(JSONWriter jsonWriter, Properties options)
            throws JSONException {
        StringWriter sw = new StringWriter();
        _pkg.getJson().write(sw);
        jsonWriter = new JSONWriter(sw);
    }

    @Override
    public void loadFromStream(InputStream inputStream) {
        try {
            this._pkg = new Package(IOUtils.toString(inputStream));
        } catch (ValidationException e) {
            logger.error("validation failed", ExceptionUtils.getStackTrace(e));
        } catch (DataPackageException e) {
            logger.error("Data package exception when loading from stream", ExceptionUtils.getStackTrace(e));
        } catch (IOException e) {
            logger.error("IO exception when loading from stream", ExceptionUtils.getStackTrace(e));
        }
    }

    public List<String> getResourcePaths() {
        List<String> listResources = new ArrayList<String>();

        for (Resource resource : _pkg.getResources()) {
            listResources.add((String) resource.getPath());
        }

        return listResources;
    }

    @Override
    public JSONObject getJSON() {
        return _pkg.getJson();
    }

    public Package getPackage() {
        return _pkg;
    }

    @Override
    public List<Exception> validate() {
        try {
            _pkg.validate();
        } catch (ValidationException | IOException | DataPackageException e) {
            logger.error("validate json failed", ExceptionUtils.getStackTrace(e));
        }

        return _pkg.getErrors();
    }
}
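
A minimal usage sketch of the DataPackageMetadata class added above, assuming a datapackage.json descriptor already exists on disk; the file paths and the wrapper class are illustrative only and are not part of this change:

    import java.io.File;
    import java.util.List;

    import com.google.refine.model.medadata.DataPackageMetadata;

    public class DataPackageMetadataSketch {
        public static void main(String[] args) {
            // Hypothetical paths, for illustration only.
            File source = new File("/tmp/datapackage.json");
            File target = new File("/tmp/datapackage-copy.json");

            DataPackageMetadata metadata = new DataPackageMetadata();
            metadata.loadFromFile(source); // parse the descriptor into a Package

            // Resource paths listed in the descriptor, as authored.
            for (String path : metadata.getResourcePaths()) {
                System.out.println("resource: " + path);
            }

            // Validation problems are collected rather than thrown.
            List<Exception> errors = metadata.validate();
            System.out.println("validation errors: " + errors.size());

            metadata.writeToFile(target); // serialize the package back to disk
        }
    }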


@@ -0,0 +1,44 @@
package com.google.refine.model.medadata;

import java.io.IOException;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;

import com.google.refine.importing.UrlRewriter;

public class DataPackageUrlRewriter implements UrlRewriter {
    @Override
    public List<Result> rewrite(String url) throws MalformedURLException, IOException {
        List<Result> listResult = new ArrayList<Result>();

        if (!filter(url))
            return listResult;

        listResult.add(new Result(url, "json", true, MetadataFormat.DATAPACKAGE_METADATA.name()));

        DataPackageMetadata meta = new DataPackageMetadata();
        meta.loadFromStream(new URL(url).openStream());

        // Import the data files.
        for (String path : meta.getResourcePaths()) {
            String fileURL = getBaseURL(url) + "/" + path;
            listResult.add(new Result(fileURL,
                    "", // leave to guesser. "text/line-based/*sv"
                    true));
        }

        return listResult;
    }

    @Override
    public boolean filter(String url) {
        return url.endsWith(DataPackageMetadata.DEFAULT_FILE_NAME);
    }

    private String getBaseURL(String url) {
        return url.replaceFirst(DataPackageMetadata.DEFAULT_FILE_NAME, "");
    }
}
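
A short sketch of how this rewriter expands a data package URL into import candidates. It assumes Result is the nested result type declared on UrlRewriter, and the descriptor URL is hypothetical; in practice the importing pipeline would normally drive this rather than a direct call:

    import java.io.IOException;
    import java.util.List;

    import com.google.refine.importing.UrlRewriter;
    import com.google.refine.model.medadata.DataPackageUrlRewriter;

    public class UrlRewriterSketch {
        public static void main(String[] args) throws IOException {
            // Hypothetical descriptor URL; only URLs ending in datapackage.json pass the filter.
            String url = "http://example.org/dataset/datapackage.json";

            UrlRewriter rewriter = new DataPackageUrlRewriter();
            if (rewriter.filter(url)) {
                // One Result for the descriptor itself, plus one per resource path it lists.
                List<UrlRewriter.Result> results = rewriter.rewrite(url);
                System.out.println("import candidates: " + results.size());
            }
        }
    }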


@@ -0,0 +1,44 @@
package com.google.refine.model.medadata;

import java.io.File;
import java.io.InputStream;
import java.time.LocalDateTime;
import java.util.List;

import org.json.JSONException;
import org.json.JSONObject;
import org.json.JSONWriter;

import com.google.refine.Jsonizable;

/**
 * Interface to import/export metadata
 */
public interface IMetadata extends Jsonizable {
    public void loadFromJSON(JSONObject obj);

    public void loadFromFile(File metadataFile);

    public void loadFromStream(InputStream inputStream);

    public void writeToFile(File metadataFile);

    /**
     * @param jsonWriter writer to save metadata to
     * @param onlyIfDirty true to not write unchanged metadata
     * @throws JSONException
     */
    public void write(JSONWriter jsonWriter, boolean onlyIfDirty);

    public MetadataFormat getFormatName();

    public void setFormatName(MetadataFormat format);

    public LocalDateTime getModified();

    public void updateModified();

    public boolean isDirty();

    public JSONObject getJSON();

    public List<Exception> validate();
}


@@ -0,0 +1,82 @@
package com.google.refine.model.medadata;

import java.io.IOException;

import org.apache.commons.lang3.exception.ExceptionUtils;
import org.everit.json.schema.ValidationException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.refine.model.Project;
import com.google.refine.util.JSONUtilities;
import com.google.refine.util.ParsingUtilities;

import io.frictionlessdata.datapackage.Package;
import io.frictionlessdata.datapackage.Resource;
import io.frictionlessdata.datapackage.exceptions.DataPackageException;

public class MetadataFactory {
    private final static Logger logger = LoggerFactory.getLogger(MetadataFactory.class);

    /**
     * Build metadata based on the format
     * @param format
     * @return
     */
    public static IMetadata buildMetadata(MetadataFormat format) {
        IMetadata metadata = null;

        if (format == MetadataFormat.PROJECT_METADATA) {
            metadata = new ProjectMetadata();
        } else if (format == MetadataFormat.DATAPACKAGE_METADATA) {
            metadata = new DataPackageMetadata();
        }

        return metadata;
    }

    /**
     * build an empty Data Package Metadata
     * @return
     */
    public static DataPackageMetadata buildDataPackageMetadata() {
        return (DataPackageMetadata) buildMetadata(MetadataFormat.DATAPACKAGE_METADATA);
    }

    /**
     * Build an empty data package metadata, then populate the fields from the Project Metadata
     * @param project
     * @return
     */
    public static DataPackageMetadata buildDataPackageMetadata(Project project) {
        DataPackageMetadata dpm = buildDataPackageMetadata();
        ProjectMetadata pmd = project.getMetadata();
        Package pkg = dpm.getPackage();
        Resource resource = SchemaExtension.createResource(project.getMetadata().getName(),
                project.columnModel);

        try {
            pkg.addResource(resource);

            putValue(pkg, Package.JSON_KEY_NAME, pmd.getName());
            putValue(pkg, PackageExtension.JSON_KEY_LAST_UPDATED, ParsingUtilities.localDateToString(pmd.getModified()));
            putValue(pkg, PackageExtension.JSON_KEY_DESCRIPTION, pmd.getDescription());
            putValue(pkg, PackageExtension.JSON_KEY_TITLE, pmd.getTitle());
            putValue(pkg, PackageExtension.JSON_KEY_HOMEPAGE, pmd.getHomepage());
            putValue(pkg, PackageExtension.JSON_KEY_IMAGE, pmd.getImage());
            putValue(pkg, PackageExtension.JSON_KEY_LICENSE, pmd.getLicense());

            pkg.removeProperty(PackageExtension.JSON_KEY_KEYWORKS);
            pkg.addProperty(PackageExtension.JSON_KEY_KEYWORKS, JSONUtilities.arrayToJSONArray(pmd.getTags()));
        } catch (ValidationException | IOException | DataPackageException e) {
            logger.error(ExceptionUtils.getStackTrace(e));
        }

        return dpm;
    }

    private static void putValue(Package pkg, String key, String value) throws DataPackageException {
        if (pkg.getJson().has(key)) {
            pkg.removeProperty(key);
        }

        pkg.addProperty(key, value);
    }
}
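
A brief sketch of the intended call path from a project to an exported descriptor, assuming a populated Project instance is available at runtime; the wrapper class and output path below are illustrative, not part of the change:

    import java.io.File;

    import com.google.refine.model.Project;
    import com.google.refine.model.medadata.DataPackageMetadata;
    import com.google.refine.model.medadata.IMetadata;
    import com.google.refine.model.medadata.MetadataFactory;
    import com.google.refine.model.medadata.MetadataFormat;

    public class MetadataFactorySketch {
        // 'project' is assumed to be obtained from the ProjectManager at runtime.
        public static void export(Project project) {
            // Format-driven construction returns the matching IMetadata implementation.
            IMetadata generic = MetadataFactory.buildMetadata(MetadataFormat.DATAPACKAGE_METADATA);
            System.out.println(generic.getFormatName());

            // Convenience overload: copies name, title, license, tags, etc. from the ProjectMetadata.
            DataPackageMetadata dpm = MetadataFactory.buildDataPackageMetadata(project);

            // Hypothetical target file.
            dpm.writeToFile(new File("/tmp/datapackage.json"));
        }
    }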

Some files were not shown because too many files have changed in this diff.