Conflicts:
	src/main/java/org/codelibs/fess/service/PathMappingService.java
Shunji Makino 2015-07-14 17:29:36 +09:00
commit 4492e4a145
710 changed files with 108446 additions and 15425 deletions

@@ -0,0 +1,9 @@
@echo off
set ANT_OPTS=-Xmx512m
set DBFLUTE_HOME=..\mydbflute\dbflute-1.1.0-sp1
set MY_PROPERTIES_PATH=build.properties
if "%pause_at_end%"=="" set pause_at_end=y

dbflute_fess/_project.sh Normal file
@@ -0,0 +1,7 @@
#!/bin/bash
export ANT_OPTS=-Xmx512m
export DBFLUTE_HOME=../mydbflute/dbflute-1.1.0-sp1
export MY_PROPERTIES_PATH=build.properties

dbflute_fess/_readme.txt Normal file
@@ -0,0 +1,72 @@
Directory for DBFlute client
manage.bat(sh) => 21 (jdbc):
An execution command for the JDBC task,
which gets your schema info and saves it to the SchemaXML file
located in the "schema" directory.
This task should be executed after the ReplaceSchema task
and before other tasks (e.g. the Generate and Document tasks).
manage.bat(sh) => 22 (doc):
An execution command for the Document task,
which creates documents, for example SchemaHTML and HistoryHTML,
in the "output/doc" directory.
manage.bat(sh) => 23 (generate):
An execution command for the Generate task,
which generates classes corresponding to your tables,
for example entities and condition-beans, into directories
specified by the DBFlute properties in the "dfprop" directory.
Generated structures (directories and classes) are like this:
/- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
allcommon : classes bridging to DBFlute Runtime
bsbhv : base behaviors
bsentity : base entities
cbean : condition-beans (both base and extended)
exbhv : extended behaviors
exentity : extended entities
- - - - - - - - - -/
For example, if a table called "MEMBER" exists,
you can use these classes like this:
/- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
memberBhv.selectEntity(cb -> {
cb.query().setMemberId_Equal(3);
}).alwaysPresent(member -> {
... = member.getMemberName();
});
// memberBhv : Behavior (instance)
// MemberCB(cb) : ConditionBean
// Member(member) : Entity
- - - - - - - - - -/
manage.bat(sh) => 24 (sql2entity):
An execution command for the Sql2Entity task,
which generates classes corresponding to your outside-SQL files,
for example entities and parameter-beans, into directories
specified by the DBFlute properties in the "dfprop" directory.
manage.bat(sh) => 0 (replace-schema):
An execution command for the ReplaceSchema task,
which creates your tables and loads data from
resources located in the "playsql" directory.
manage.bat(sh) => 25 (outside-sql-test):
An execution command for the OutsideSqlTest task,
which executes your outside-SQL files so you can check
whether the SQL statements are correctly formatted.
The following directories are used by DBFlute tasks:
/- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
dfprop : Directory for DBFlute properties
extlib : Directory for library extensions
log : Directory for log files of DBFlute tasks
output/doc : Directory for auto-generated documents
playsql : Directory for ReplaceSchema task
schema : Directory for files of schema info
- - - - - - - - - -/
The files _project.bat, _project.sh, and build.properties
are used by internal processes of DBFlute tasks, so basically
you don't need to touch them.

@@ -0,0 +1,5 @@
# -------------------------------------------------------------------
# P R O J E C T
# -------------------------------------------------------------------
torque.project = fess

@@ -0,0 +1,35 @@
Directory for DBFlute properties
Required (Basic) Properties:
o basicInfoMap.dfprop
o databaseInfoMap.dfprop
First, you should set these properties
before executing any DBFlute tasks.
Properties for additional information:
o additionalForeignKeyMap.dfprop
o additionalPrimaryKeyMap.dfprop
o additionalUniqueKeyMap.dfprop
o additionalTableMap.dfprop
Properties for implementation environments:
o commonColumnMap.dfprop
o classificationDefinitionMap.dfprop
o classificationDeploymentMap.dfprop
o optimisticLockDefinitionMap.dfprop
o outsideSqlDefinitionMap.dfprop
o sequenceDefinitionMap.dfprop
o dependencyInjectionMap.dfprop
o littleAdjustmentMap.dfprop
o includeQueryMap.dfprop
o typeMappingMap.dfprop
Properties for ReplaceSchema:
o replaceSchemaDefinitionMap.dfprop
Properties for documents:
o documentDefinitionMap.dfprop
Properties for non-functional adjustments:
o allClassCopyright.dfprop
o refreshDefinitionMap.dfprop

@@ -0,0 +1,42 @@
# /---------------------------------------------------------------------------
# additionalForeignKeyMap: (NotRequired - Default map:{})
#
# If a foreign key does not exist in your database,
# you can set it up here as a virtual foreign key for DBFlute.
#
# If you add a fixed condition to the referrer table so that the
# relation becomes one-to-one, you can set up the virtual foreign key
# with fixedCondition and fixedSuffix. You can use this for view objects too.
#
# If the local column name is the same as the foreign column name,
# you can omit localColumnName and foreignColumnName.
# The names are treated as case-insensitive.
#
# Example:
# map:{
# ; FK_MEMBER_MEMBER_STATUS_CODE = map:{
# ; localTableName = MEMBER ; foreignTableName = MEMBER_STATUS
# ; localColumnName = MEMBER_STATUS_CODE ; foreignColumnName = MEMBER_STATUS_CODE
# }
# ; FK_PURCHASE_MEMBER_ID = map:{
# ; localTableName = PURCHASE ; foreignTableName = MEMBER
# }
# ; FK_MEMBER_MEMBER_ADDRESS_VALID = map:{
# ; localTableName = MEMBER ; foreignTableName = MEMBER_ADDRESS
# ; localColumnName = MEMBER_ID ; foreignColumnName = MEMBER_ID
# ; fixedCondition =
# $$foreignAlias$$.VALID_BEGIN_DATE <= /*targetDate(Date)*/null
# and $$foreignAlias$$.VALID_END_DATE >= /*targetDate(Date)*/null
# ; fixedSuffix = AsValid
# }
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
#; FK_MEMBER_MEMBER_STATUS_CODE = map:{
# ; localTableName = MEMBER ; foreignTableName = MEMBER_STATUS
# ; localColumnName = MEMBER_STATUS_CODE ; foreignColumnName = MEMBER_STATUS_CODE
#}
}
# ----------------/

@@ -0,0 +1,26 @@
# /---------------------------------------------------------------------------
# additionalPrimaryKeyMap: (NotRequired - Default map:{})
#
# If a primary key does not exist in your database,
# you can set it up here as a virtual primary key for DBFlute.
# You can use this for view objects too.
# The names are treated as case-insensitive.
#
# Example:
# map:{
# ; PK_MEMBER = map:{
# ; tableName = MEMBER ; columnName = MEMBER_ID
# }
# ; PK_PURCHASE = map:{
# ; tableName = PURCHASE ; columnName = PURCHASE_ID
# }
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
#; PK_MEMBER = map:{
# ; tableName = MEMBER ; columnName = MEMBER_ID
#}
}
# ----------------/

@@ -0,0 +1,42 @@
# /---------------------------------------------------------------------------
# additionalTableMap: (NotRequired - Default map:{})
#
# This property is used only by the JDBC task.
# Use this when JDBC cannot provide table information,
# or when you have no tables and only call stored procedures.
#
# The element 'columnMap' is the only required element of a table element.
# The element 'type' is the only required element of a column element.
#
# Specification:
# map: {
# [table-name] = map:{
# columnMap = map:{
# [column-name] = map:{
# type = [column JDBC type] ; dbType = [column DB type]
# ; required = [true or false] ; size = [column size]
# ; primaryKey = [true or false] ; pkName = [PK constraint name]
# ; autoIncrement = [true or false]
# ; default = [default value] ; comment = [column comment]
# }
# }
# ; comment = [table comment]
# }
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
#; FOO_TABLE = map:{
# ; columnMap = map:{
# FOO_ID = map:{ type = INTEGER ; dbType = INTEGER
# ; required = true ; primaryKey = true ; autoIncrement = true
# }
# FOO_NAME = map:{ type = VARCHAR ; required = true ; size = 123 }
# FOO_DATE = map:{ type = DATE }
# }
#}
}
# ----------------/
#
# *Refer to typeMappingMap.dfprop for JDBC type reference.

@@ -0,0 +1,26 @@
# /---------------------------------------------------------------------------
# additionalUniqueKeyMap: (NotRequired - Default map:{})
#
# If a unique key does not exist in your database,
# you can set it up here as a virtual unique key for DBFlute.
# You can use this for view objects too.
# The names are treated as case-insensitive.
#
# Example:
# map:{
# ; UQ_MEMBER = map:{
# ; tableName = MEMBER ; columnName = MEMBER_ACCOUNT
# }
# ; UQ_PRODUCT = map:{
# ; tableName = PRODUCT ; columnName = PRODUCT_HANDLE_CODE
# }
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
#; UQ_MEMBER = map:{
# ; tableName = MEMBER ; columnName = MEMBER_ACCOUNT
#}
}
# ----------------/

@@ -0,0 +1,11 @@
# /---------------------------------------------------------------------------
# allClassCopyright: (NotRequired - Default '')
#
# The copyright for all classes.
# This property is NOT map style.
# You should specify this before your first generation.
#
#/*
# * Copyright(c) DBFlute TestCo.,TestLtd. All Rights Reserved.
# */
# ----------------/

@@ -0,0 +1,228 @@
# /---------------------------------------------------------------------------
# basicInfoMap: (Required)
#
# The basic information for the tasks of DBFlute.
# You should specify this before your first generation.
#
# Core Properties:
# o database: (Required)
# o targetLanguage: (Required)
# o targetContainer: (Required)
# o packageBase: (Required)
#
# Adjustment Properties:
# o generateOutputDirectory: (NotRequired - Default Java:'../src/main/java' CSharp:'../source')
# o resourceOutputDirectory: (NotRequired - Default '../resources')
# o isTableNameCamelCase: (NotRequired - Default false)
# o isColumnNameCamelCase: (NotRequired - Default false)
# o projectPrefix: (NotRequired - Default '')
# o classAuthor: (NotRequired - Default 'DBFlute(AutoGenerator)')
# o sourceFileEncoding: (NotRequired - Default 'UTF-8')
# o sourceCodeLineSeparator: (NotRequired - Default no setting)
# o applicationBehaviorMap: (NotRequired - Default map:{})
# o outputPackageAdjustmentMap: (NotRequired - Default map:{})
# o dbfluteSystemFinalTimeZone: (NotRequired - Default null)
#
# *The line that starts with '#' means comment-out.
#
map:{
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o database: (Required)
# This is the target database, only considered when generating
# the SQL for your DBFlute project.
# Your possible choices are:
#
# mysql, postgresql, oracle, db2, sqlserver,
# h2, derby, (sqlite, firebird, msaccess)
#
; database = h2
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o targetLanguage: (Required)
# The target language.
# Your possible choices are:
#
# java, csharp, scala
#
; targetLanguage = java
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o targetContainer: (Required)
# The target DI container.
# If your target language is 'csharp', you can specify 'seasar' only.
# Your possible choices are:
#
# spring, guice, seasar, cdi
#
; targetContainer = spring
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o packageBase: (Required)
# The base package for generated classes.
# The class package is 'this property value + detail package value'.
# However, the detail packages have default values, so basically
# you only have to specify 'packageBase'.
# If this property is specified and the detail package properties are not,
# then the packages of generated classes are as follows:
#
# e.g. packageBase = org.docksidestage.dbflute
# --> org.docksidestage.dbflute.allcommon
# --> org.docksidestage.dbflute.bsbhv
# --> org.docksidestage.dbflute.bsentity
# --> org.docksidestage.dbflute.cbean
# --> org.docksidestage.dbflute.exbhv
# --> org.docksidestage.dbflute.exentity
#
; packageBase = org.codelibs.fess.db
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o generateOutputDirectory: (NotRequired - Default Java:'../src/main/java' CSharp:'../source')
# The base output directory for generating.
# Basically you don't need to specify this if the project style is as follows:
#
# *Java Project Style
# If this value is '../src/main/java' and your project is under the Maven,
# you don't need to set up this property!
#
# {app-project}
# |
# |-dbflute_[project]
# | |-dfprop
# | |-...
# |
# |-src/main/java // *Here!
# |-src/main/resources
# |-...
#
# *CSharp Project Style
# [app-solution]/dbflute_[project]/dfprop
# [app-solution]/mydbflute/dbflute-0.9.6
# [app-solution]/source/[app-solution].sln
# [app-solution]/source/[app-project(top-namespace)]/[part-namespace]/AllCommon
# [app-solution]/source/[app-project(top-namespace)]/[part-namespace]/BsBhv
# [app-solution]/source/[app-project(top-namespace)]/[part-namespace]/...
# [app-solution]/source/[app-project(top-namespace)]/Resources/...
#
#; generateOutputDirectory = ../src/main/java
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o resourceOutputDirectory: (NotRequired - Default '../resources')
# The base output directory for resource files that contain DI configurations.
# Basically you don't need to specify this if your project is under the Maven.
#
#; resourceOutputDirectory = ../resources
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isTableNameCamelCase: (NotRequired - Default false)
# Is the table name camel case?
# Basically you don't need this if the style of table name is like 'FOO_STATUS'.
# [true]
# The table name is camel case.
# e.g. If the table name is 'OrderDetail', the class name is 'OrderDetail'.
#
# [false]
# e.g. If the table name is 'ORDER_DETAIL', the class name is 'OrderDetail'.
# e.g. If the table name is 'OrderDetail', the class name is 'Orderdetail'.
#
#; isTableNameCamelCase = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isColumnNameCamelCase: (NotRequired - Default false)
# Is the column name camel case?
# Basically you don't need this if the style of column name is like 'FOO_NAME'.
# [true]
# The column name is camel case.
# e.g. If the column name is 'OrderDetailId', the class name is 'OrderDetailId'.
#
# [false]
# e.g. If the column name is 'ORDER_DETAIL_ID', the class name is 'OrderDetailId'.
# e.g. If the column name is 'OrderDetailId', the class name is 'Orderdetailid'.
#
#; isColumnNameCamelCase = false
# - - - - - - - - - -/
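# (The camel-case handling described by the two properties above boils down
# to a simple naming rule. The following is a toy, hand-written sketch of
# that rule for illustration only; the class and method names are invented
# and this is NOT DBFlute's actual implementation.)

```java
// Toy sketch of the table/column-name-to-class-name rule described above.
// NOT DBFlute's real code; names here are invented for illustration.
public class NamingSketch {
    // With isTableNameCamelCase=false (the default), the name is first
    // lower-cased, then split on '_', and each token is capitalized.
    // So 'ORDER_DETAIL' -> 'OrderDetail', but 'OrderDetail' -> 'Orderdetail'.
    public static String toClassName(String tableName) {
        StringBuilder sb = new StringBuilder();
        for (String token : tableName.toLowerCase().split("_")) {
            if (token.isEmpty()) continue;
            sb.append(Character.toUpperCase(token.charAt(0)))
              .append(token.substring(1));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toClassName("ORDER_DETAIL")); // OrderDetail
        System.out.println(toClassName("OrderDetail"));  // Orderdetail
    }
}
```

# (This is why isTableNameCamelCase/isColumnNameCamelCase must be set to true
# when the database names are already camel case: otherwise the lower-casing
# step destroys the word boundaries.)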
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o projectPrefix: (NotRequired - Default '')
# If the value is 'Ld', all class names become 'LdXxx'.
# Basically you don't need this if you don't want a common class-name prefix.
#
#; projectPrefix = Ld
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o classAuthor: (NotRequired - Default 'DBFlute(AutoGenerator)')
# The value of the author tag in java-doc of generated classes.
# All classes are target.
#
#; classAuthor = DBFlute(AutoGenerator)
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o sourceFileEncoding: (NotRequired - Default 'UTF-8')
# The value of an encoding for source files that are generated classes.
# If source files of your project are not UTF-8, specify your encoding here.
#
#; sourceFileEncoding = UTF-8
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o sourceCodeLineSeparator: (NotRequired - Default no setting)
# The line separator setting for source code of generated classes.
# LF -> converted to LF
# CRLF -> converted to CRLF
# (no setting) -> no convert (template default is CRLF)
#
#; sourceCodeLineSeparator = LF
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o applicationBehaviorMap: (NotRequired - Default map:{})
# The settings for Application Behavior.
# Elements of this map are as below:
# o isApplicationBehaviorProject: (NotRequired - Default false)
# Is the project for application behaviors?
# This property is a main signal for Application Behavior.
# Other properties (for Application Behavior) work when this is true.
# o libraryProjectPackageBase: (NotRequired - Default same as application's one)
# If the application's package base is different from the library's,
# set this property to the library's package base.
#
#; applicationBehaviorMap = map:{
# ; isApplicationBehaviorProject = false
# ; libraryProjectPackageBase =
#}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o outputPackageAdjustmentMap: (NotRequired - Default map:{})
# The adjustments for output package.
# Elements of this map are as below:
# o flatDirectoryPackage: (Required - Default '')
# This is only for CSharp.
# e.g. Aaa.Bbb.DBFlute --> Directory: source/Aaa.Bbb.DBFlute/AllCommon
# o omitDirectoryPackage: (NotRequired - Default '')
# This is only for CSharp.
# e.g. Aaa --> Directory: source/Bbb/DBFlute/AllCommon
#
#; outputPackageAdjustmentMap = map:{
# ; flatDirectoryPackage = Aaa.Bbb.DBFlute
# ; omitDirectoryPackage = Aaa
#}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o dbfluteSystemFinalTimeZone: (NotRequired - Default null)
# The ID of time-zone for DBFlute system.
# basically for e.g. DisplaySql, Date conversion, LocalDate mapping and so on...
#
#; dbfluteSystemFinalTimeZone = GMT
# - - - - - - - - - -/
}
# ----------------/

@@ -0,0 +1,56 @@
# /---------------------------------------------------------------------------
# classificationDefinitionMap: (NotRequired - Default map:{})
#
# The definition of classification.
#
# Specification:
# map: {
# [classification-name] = list:{
# ; map:{
# ; topComment=[comment]; codeType=[String(default) or Number or Boolean]}
# ; undefinedHandlingType=[EXCEPTION or LOGGING(default) or ALLOWED]
# ; isUseDocumentOnly=[true or false(default)]
# ; isSuppressAutoDeploy=[true or false(default)]
# ; groupingMap = map:{
# ; [group-name] = map:{
# ; groupComment=[comment]
# ; elementList=list:{[the list of classification element's name]}
# }
# }
# }
# # classification elements for implicit classification
# ; map:{
# ; code=[code]; name=[name]; alias=[alias]; comment=[comment]
# ; sisterCode=[code or code-list]; subItemMap=map:{[free-map]}
# }
# # settings for table classification
# ; map:{
# ; table=[table-name]
# ; code=[column-name for code]; name=[column-name for name]
# ; alias=[column-name for alias]; comment=[column-name for comment]}
# ; where=[condition for select]; orderBy=[column-name for ordering]
# ; exceptCodeList=[the list of except code]
# }
# }
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
# example for implicit classification
#; Flg = list:{
# ; map:{topComment=general boolean classification for every flg-column; codeType=Number}
# ; map:{code=1; name=True ; alias=Checked ; comment=means yes; sisterCode=true}
# ; map:{code=0; name=False; alias=Unchecked; comment=means no ; sisterCode=false}
#}
# example for table classification
#; MemberStatus = list:{
# ; map:{topComment=status of member from entry to withdrawal; codeType=String}
# ; map:{
# ; table=MEMBER_STATUS
# ; code=MEMBER_STATUS_CODE; name=MEMBER_STATUS_NAME
# ; comment=DESCRIPTION; orderBy=DISPLAY_ORDER
# }
#}
}
# ----------------/
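# (As a rough illustration of what the commented-out 'Flg' classification
# above could look like on the application side: DBFlute actually generates
# much richer CDef classes, so this hand-written enum is only a hypothetical
# sketch of the code/name/alias triple defined in the map.)

```java
// Hypothetical sketch of the 'Flg' classification above as a plain enum.
// Real DBFlute generates CDef classes with more features; this is only
// meant to show how code, name, and alias relate.
public enum FlgSketch {
    True("1", "Checked"),    // code=1; name=True ; alias=Checked
    False("0", "Unchecked"); // code=0; name=False; alias=Unchecked

    private final String code;
    private final String alias;

    FlgSketch(String code, String alias) {
        this.code = code;
        this.alias = alias;
    }

    public String code() { return code; }
    public String alias() { return alias; }

    // Look up a classification element by its code; null when undefined.
    public static FlgSketch codeOf(String code) {
        for (FlgSketch v : values()) {
            if (v.code.equals(code)) return v;
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(FlgSketch.codeOf("1").alias()); // Checked
    }
}
```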

@@ -0,0 +1,25 @@
# /---------------------------------------------------------------------------
# classificationDeploymentMap: (NotRequired - Default map:{})
#
# The relation between column and classification.
#
# This property uses classification names of classificationDefinitionMap.
# The table name '$$ALL$$' means all tables are target.
# The table names and column names are treated as case insensitive.
#
# You don't need to specify table classifications here,
# because they are auto-deployed from relation information.
#
# Specification:
# map: {
# [table-name or $$ALL$$] = map:{
# ; [column-name (with hint)]=[classification-name]
# }
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
#; $$ALL$$ = map:{suffix:_FLG=Flg}
}
# ----------------/

@@ -0,0 +1,49 @@
# /---------------------------------------------------------------------------
# commonColumnMap: (Default map:{})
#
# The definition of common columns (including auto set-up).
# For example, the date you registered the record,
# the user who updated the record and so on...
# The column names are treated as case insensitive.
#
# The variable '$$AccessContext$$' means allcommon.AccessContext.
#
# Example:
# map:{
# ; commonColumnMap = map:{
# ; REGISTER_DATETIME=TIMESTAMP ; REGISTER_USER=VARCHAR
# ; UPDATE_DATETIME=TIMESTAMP ; UPDATE_USER=VARCHAR
# }
# ; beforeInsertMap = map:{
# ; REGISTER_DATETIME = $$AccessContext$$.getAccessLocalDateTimeOnThread()
# ; REGISTER_USER = $$AccessContext$$.getAccessUserOnThread()
# ; UPDATE_DATETIME = entity.getRegisterDatetime()
# ; UPDATE_USER = entity.getRegisterUser()
# }
# ; beforeUpdateMap = map:{
# ; UPDATE_DATETIME = $$AccessContext$$.getAccessLocalDateTimeOnThread()
# ; UPDATE_USER = $$AccessContext$$.getAccessUserOnThread()
# }
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
#; commonColumnMap = map:{
# ; REGISTER_DATETIME=TIMESTAMP ; REGISTER_USER=VARCHAR
# ; UPDATE_DATETIME=TIMESTAMP ; UPDATE_USER=VARCHAR
#}
#; beforeInsertMap = map:{
# ; REGISTER_DATETIME = $$AccessContext$$.getAccessLocalDateTimeOnThread()
# ; REGISTER_USER = $$AccessContext$$.getAccessUserOnThread()
# ; UPDATE_DATETIME = entity.getRegisterDatetime()
# ; UPDATE_USER = entity.getRegisterUser()
#}
#; beforeUpdateMap = map:{
# ; UPDATE_DATETIME = $$AccessContext$$.getAccessLocalDateTimeOnThread()
# ; UPDATE_USER = $$AccessContext$$.getAccessUserOnThread()
#}
}
# ----------------/
#
# *Refer to typeMappingMap.dfprop for JDBC type reference.

@@ -0,0 +1,112 @@
# /---------------------------------------------------------------------------
# databaseInfoMap: (Required)
#
# The database information for the tasks of DBFlute.
# You should specify this before your first generation.
#
# o driver -- The class name of JDBC-Driver.
# o url -- The URL for connecting database.
# o schema -- The schema name.
# o user -- The database user name.
# o password -- The database password.
# o propertiesMap   -- The properties that depend on the database.
# o variousMap -- The various settings about JDBC task.
#
# *The line that starts with '#' means comment-out.
#
map:{
; driver = org.h2.Driver
; url = jdbc:h2:file:../src/main/resources/fess
; schema =
; user = sa
; password =
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o propertiesMap: (NotRequired - Default map:{})
#  The properties that depend on the database.
#
; propertiesMap = map:{
# If you use Oracle and its Synonym, specify this property.
#; includeSynonyms=true
}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o variousMap: (NotRequired - Default map:{})
# The various settings about JDBC task mainly.
#
; variousMap = map:{
# o objectTypeTargetList: (NotRequired - Default list:{TABLE;VIEW})
#  If you want to include other object types in the generating target,
#  specify the list of included object types (adding to the default).
# e.g. Synonym of Oracle --> list:{TABLE ; VIEW ; SYNONYM}
#  This is only for the main schema; additional schemas are unaffected.
#  However, the ReplaceSchema and Sql2Entity tasks also use this,
#  though you can give ReplaceSchema its own setting in its own dfprop.
#
#; objectTypeTargetList = list:{TABLE ; VIEW}
# o tableExceptList: (NotRequired - Default list:{})
# If you want to exclude some tables in generating target,
# you should specify the list of excepted table hints.
# e.g. list:{PRODUCT_STATUS ; prefix:TMP_}
#  This is only for the main schema; additional schemas are unaffected.
#  Also, the ReplaceSchema task basically ignores this.
#
#  Normally 'except' means that no meta data is obtained for excepted tables
#  (so the tables do not appear in SchemaHTML, HistoryHTML, and so on).
#  But you can add the '@gen' suffix, which means generate-only except:
#  a table with that mark appears in documents but no classes are generated.
#
#; tableExceptList = list:{FOO_TABLE@gen ; prefix:FOO_@gen ; suffix:_FOO ; contain:_FOO_}
# o tableTargetList: (NotRequired - Default list:{})
# If you want to include some tables in generating target expressly,
# you should specify the list of target table hints.
# e.g. list:{PURCHASE ; contain:MEMBER}
#  This is only for the main schema; additional schemas are unaffected.
#  Also, the ReplaceSchema task basically ignores this.
#
#; tableTargetList = list:{FOO_TABLE ; prefix:FOO_ ; suffix:_FOO ; contain:_FOO_}
# o columnExceptMap: (NotRequired - Default map:{})
# If you want to exclude some columns in generating target,
# you should specify the list of excepted column hints.
# e.g. map:{HEAVY_MASTER = list:{APP_NOT_USED_ID; suffix:_IMAGE}}
#  This is only for the main schema; additional schemas are unaffected.
#
#; columnExceptMap = map:{
# ; FOO_TABLE = list:{FOO_COLUMN ; prefix:FOO_ ; suffix:_FOO ; contain:_FOO_}
#}
# o additionalSchemaMap: (NotRequired - Default map:{})
# If you want to include other schemas in generating target,
# you should specify the map of included schemas.
# Additional schemas have original settings apart from the main schema.
# The settings are objectTypeTargetList, tableExceptList,
# tableTargetList, and columnExceptMap.
# They have the same specification as ones of the main schema.
# Elements of this map are as below:
# o objectTypeTargetList: (NotRequired - Default 'map:{TABLE;VIEW}')
# o tableExceptList: (NotRequired - Default list:{})
# o tableTargetList: (NotRequired - Default list:{})
# o columnExceptMap: (NotRequired - Default map:{})
# o isSuppressCommonColumn: (NotRequired - Default false)
# o isSuppressProcedure: (NotRequired - Default false)
#
#; additionalSchemaMap = map:{
# ; NEXTEXAMPLEDB = map:{
# ; objectTypeTargetList=list:{TABLE ; VIEW}
# ; tableExceptList=list:{FOO_TABLE ; prefix:FOO_ ; suffix:_FOO ; contain:_FOO_}
# ; tableTargetList=list:{FOO_TABLE ; prefix:FOO_ ; suffix:_FOO ; contain:_FOO_}
# ; columnExceptMap=map:{
# ; FOO_TABLE = list:{FOO_COLUMN ; prefix:FOO_ ; suffix:_FOO ; contain:_FOO_}
# }
# ; isSuppressCommonColumn=false
# ; isSuppressProcedure=false
# }
#}
}
# - - - - - - - - - -/
}
# ----------------/
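# (The map:{ ... } notation used throughout these dfprop files is DBFlute's
# own format. As a rough illustration of the flat key/value case shown at the
# top of this file (driver, url, schema, user, password), here is a toy
# reader. It deliberately ignores nesting, lists, and '#' comments, all of
# which real dfprop supports; it is NOT DBFlute's actual parser.)

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Toy reader for a FLAT dfprop map entry such as
//   map:{ ; driver = org.h2.Driver ; user = sa }
// Real dfprop supports nesting, lists, and comments; this sketch does not.
public class DfpropSketch {
    public static Map<String, String> parseFlat(String text) {
        Map<String, String> result = new LinkedHashMap<>();
        String body = text.trim();
        if (body.startsWith("map:{")) {
            body = body.substring("map:{".length());
        }
        if (body.endsWith("}")) {
            body = body.substring(0, body.length() - 1);
        }
        // Entries are separated by ';'; each entry is 'key = value'.
        for (String entry : body.split(";")) {
            int eq = entry.indexOf('=');
            if (eq < 0) continue; // skip empty segments between delimiters
            String key = entry.substring(0, eq).trim();
            String value = entry.substring(eq + 1).trim();
            if (!key.isEmpty()) result.put(key, value);
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, String> m =
            parseFlat("map:{ ; driver = org.h2.Driver ; user = sa ; password = }");
        System.out.println(m.get("driver")); // org.h2.Driver
    }
}
```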

@@ -0,0 +1,125 @@
# /---------------------------------------------------------------------------
# dependencyInjectionMap: (NotRequired - Default map:{})
#
# The various settings about dependency injection(DI Container).
#
# {Java} Seasar Only:
# o dbfluteDiconNamespace: (NotRequired - Default 'dbflute')
# o dbfluteDiconPackageName (NotRequired - Default '../resources')
# o dbfluteDiconFileName: (NotRequired - Default 'dbflute.dicon')
# o j2eeDiconResourceName: (NotRequired - Default 'j2ee.dicon')
# o dbfluteDiconBeforeJ2eeIncludeDefinitionMap: (NotRequired - Default map:{})
# o dbfluteDiconOtherIncludeDefinitionMap: (NotRequired - Default map:{})
#
# {Java} Spring Only:
# o dbfluteBeansPackageName (NotRequired - Default '../resources')
# o dbfluteBeansFileName: (NotRequired - Default 'dbfluteBeans.xml')
# o dbfluteBeansDataSourceName: (NotRequired - Default 'dataSource')
# o dbfluteBeansDefaultAttribute: (NotRequired - Default null)
# o isDBFluteBeansGeneratedAsJavaConfig (NotRequired - Default true since 1.1)
#
# {CSharp} Quill(CSharp Seasar) Only:
# o quillDataSourceName: (NotRequired - Default null)
#
# *The line that starts with '#' means comment-out.
#
map:{
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o dbfluteDiconNamespace: (NotRequired - Default 'dbflute')
# The namespace of DBFlute DI configuration.
#
# @SeasarOnly
#; dbfluteDiconNamespace = dbflute
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o dbfluteDiconPackageName (NotRequired - Default '../resources')
# The package name(output directory) of DBFlute DI configuration for Seasar.
#
# @SeasarOnly
#; dbfluteDiconPackageName = ../resources
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o dbfluteDiconFileName: (NotRequired - Default 'dbflute.dicon')
# The file name of DBFlute DI configuration for Seasar.
#
# @SeasarOnly
#; dbfluteDiconFileName = dbflute.dicon
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o j2eeDiconResourceName: (NotRequired - Default 'j2ee.dicon')
# The file name of J2EE DI configuration.
#
# @SeasarOnly
#; j2eeDiconResourceName = j2ee.dicon
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o dbfluteDiconBeforeJ2eeIncludeDefinitionMap: (NotRequired - Default map:{})
# The include definition of DBFlute DI configuration before j2ee including.
# e.g. map:{ jdbc-xxx.dicon = dummy }
#
# @SeasarOnly
#; dbfluteDiconBeforeJ2eeIncludeDefinitionMap = map:{}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o dbfluteDiconOtherIncludeDefinitionMap: (NotRequired - Default map:{})
# The other include definition of DBFlute DI configuration.
# e.g. map:{ common.dicon = dummy }
#
# @SeasarOnly
#; dbfluteDiconOtherIncludeDefinitionMap = map:{}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o dbfluteBeansPackageName (NotRequired - Default '../resources')
# The package name(output directory) of DBFlute DI configuration for Spring and Lucy.
#
# @SpringOnly
#; dbfluteBeansPackageName = ../resources
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o dbfluteBeansFileName: (NotRequired - Default 'dbfluteBeans.xml')
# The file name of DBFlute DI configuration for Spring and Lucy.
#
# @SpringOnly
#; dbfluteBeansFileName = dbfluteBeans.xml
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o dbfluteBeansDataSourceName: (NotRequired - Default 'dataSource')
# The data source name that DBFlute(Behaviors) uses.
#
# @SpringOnly
#; dbfluteBeansDataSourceName = exampleDataSource
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o dbfluteBeansDefaultAttribute: (NotRequired - Default null)
# The default attribute expression of DBFlute DI configuration for Spring and Lucy.
#
# @SpringOnly
#; dbfluteBeansDefaultAttribute = default-lazy-init="true"
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isDBFluteBeansGeneratedAsJavaConfig (NotRequired - Default true since 1.1)
# Does it generate JavaConfig for DBFluteBeans (instead of XML configuration)?
#
# @SpringOnly
#; isDBFluteBeansGeneratedAsJavaConfig = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o quillDataSourceName: (NotRequired - Default null)
# The data source name of Quill(CSharp Seasar).
#
# @QuillOnly
#; quillDataSourceName = ExampleDB
# - - - - - - - - - -/
}
# ----------------/

@@ -0,0 +1,225 @@
# /---------------------------------------------------------------------------
# documentDefinitionMap: (NotRequired - Default map:{})
#
# o documentOutputDirectory (NotRequired - Default './output/doc')
# o aliasDelimiterInDbComment (NotRequired - Default '')
# o isDbCommentOnAliasBasis (NotRequired - Default false)
# o isEntityJavaDocDbCommentValid (NotRequired - Default true)
# o isEntityDBMetaDbCommentValid (NotRequired - Default false)
# o schemaHtmlFileName (NotRequired - Default 'schema-[project-name].html')
# o isSuppressSchemaHtmlOutsideSql (NotRequired - Default false)
# o isSuppressSchemaHtmlProcedure (NotRequired - Default false)
# o historyHtmlFileName (NotRequired - Default 'history-[project-name].html')
# o isCheckColumnDefOrderDiff (NotRequired - Default false)
# o isCheckDbCommentDiff (NotRequired - Default false)
# o isCheckProcedureDiff (NotRequired - Default false)
# o loadDataReverseMap (NotRequired - Default map:{})
# o schemaSyncCheckMap (NotRequired - Default map:{})
# o propertiesHtmlMap: (NotRequired - Default map:{})
#
# Example:
# map:{
# ; documentOutputDirectory = ./output/doc
# ; aliasDelimiterInDbComment = :
# ; isDbCommentOnAliasBasis = true
# ; isEntityJavaDocDbCommentValid = true
# ; isEntityDBMetaDbCommentValid = true
# ; schemaHtmlFileName = xxx.html
# ; isSuppressSchemaHtmlOutsideSql = false
# ; isSuppressSchemaHtmlProcedure = false
# ; historyHtmlFileName = xxx.html
# ; isCheckColumnDefOrderDiff = true
# ; isCheckDbCommentDiff = true
# ; isCheckProcedureDiff = true
# ; loadDataReverseMap = map:{
# ; recordLimit = -1
# ; isReplaceSchemaDirectUse = true
# ; isOverrideExistingDataFile = false
# ; isSynchronizeOriginDate = false
# }
# ; schemaSyncCheckMap = map:{
# ; url = jdbc:...
# ; schema = EXAMPLEDB
# ; user = exampuser
# ; password = exampword
# }
# ; propertiesHtmlMap = map:{
# ; ApplicationProperties = map:{
# ; rootFile = ../src/main/resources/application_ja.properties
# }
# }
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o documentOutputDirectory (NotRequired - Default './output/doc')
# The output directory mainly for SchemaHtml and DataXlsTemplate.
# Basically you don't need to change this;
# keeping it at the same location for every project is valuable.
#
#; documentOutputDirectory = ./output/doc
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o aliasDelimiterInDbComment (NotRequired - Default '')
# If an alias exists in the DB comment, delimited as follows:
# member name : The name of the member's full name
# you can use the alias throughout the DBFlute world: Javadoc, SchemaHTML, and so on.
# A DB comment without the delimiter is not treated as an alias
# but as a description (plain comment).
# You can change this behavior with 'isDbCommentOnAliasBasis'.
#
#; aliasDelimiterInDbComment = :
# - - - - - - - - - -/
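To make the delimiter behavior concrete, here is a minimal Java sketch of how a DB comment could be split into alias and description. The `AliasSplitter` class is hypothetical and for illustration only; DBFlute performs this parsing internally.

```java
// Sketch: split a DB comment into alias and description on the configured delimiter.
// AliasSplitter is a hypothetical illustration, not a DBFlute class.
public class AliasSplitter {

    public static String[] split(String dbComment, String delimiter) {
        int idx = dbComment.indexOf(delimiter);
        if (idx < 0) {
            // No delimiter: the whole comment is a description, there is no alias.
            return new String[] { null, dbComment.trim() };
        }
        String alias = dbComment.substring(0, idx).trim();
        String description = dbComment.substring(idx + delimiter.length()).trim();
        return new String[] { alias, description };
    }

    public static void main(String[] args) {
        String[] parts = split("member name : The name of the member's full name", ":");
        System.out.println(parts[0]); // member name
        System.out.println(parts[1]); // The name of the member's full name
    }
}
```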
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isDbCommentOnAliasBasis (NotRequired - Default false)
# Is DB comment on alias basis?
# (Is the DB comment treated as the alias name when it has no alias delimiter?)
# This property works with 'aliasDelimiterInDbComment'.
#
#; isDbCommentOnAliasBasis = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isEntityJavaDocDbCommentValid (NotRequired - Default true)
# Does it include the DB comment in the entity's Javadoc?
#
#; isEntityJavaDocDbCommentValid = true
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isEntityDBMetaDbCommentValid (NotRequired - Default false)
# Does it include the DB comment in the entity's DB meta?
#
#; isEntityDBMetaDbCommentValid = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o schemaHtmlFileName (NotRequired - Default 'schema-[project-name].html')
# The file name (without path) of SchemaHtml.
# Basically you don't need this.
# (For example, when you use Application Behavior, you need this)
#
#; schemaHtmlFileName = xxx.html
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isSuppressSchemaHtmlOutsideSql (NotRequired - Default false)
# Does it remove outsideSql information from SchemaHtml?
# Basically you don't need this.
# OutsideSql information (related to tables) is very useful.
#
#; isSuppressSchemaHtmlOutsideSql = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isSuppressSchemaHtmlProcedure (NotRequired - Default false)
# Does it remove procedure information from SchemaHtml?
# Basically you don't need this.
# Procedure information is very useful.
#
#; isSuppressSchemaHtmlProcedure = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o historyHtmlFileName (NotRequired - Default 'history-[project-name].html')
# The file name (without path) of HistoryHtml.
# Basically you don't need this.
# (For example, when you use Application Behavior, you need this)
#
#; historyHtmlFileName = xxx.html
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isCheckColumnDefOrderDiff (NotRequired - Default false)
# Does it check differences in column definition order?
# (added or deleted columns are excluded)
#
#; isCheckColumnDefOrderDiff = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isCheckDbCommentDiff (NotRequired - Default false)
# Does it check differences in table, column, and other comments?
#
#; isCheckDbCommentDiff = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isCheckProcedureDiff (NotRequired - Default false)
# Does it check differences in procedures?
#
#; isCheckProcedureDiff = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o loadDataReverseMap: (NotRequired - Default map:{})
# You can set LoadDataReverse settings.
# This property is valid when the property 'recordLimit' is set.
# Elements of this map are as below:
# o recordLimit: The limit of records to output. A negative value means no limit. (NotRequired - Default '')
# o isReplaceSchemaDirectUse: Does it output the data to playsql directly? (NotRequired - Default false)
# o isOverrideExistingDataFile: Does it output to existing files? (NotRequired - Default false)
# o isSynchronizeOriginDate: Does it synchronize origin date for date adjustment? (NotRequired - Default false)
#
; loadDataReverseMap = map:{
; recordLimit = -1
; isReplaceSchemaDirectUse = true
; isOverrideExistingDataFile = false
; isSynchronizeOriginDate = false
}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o schemaSyncCheckMap: (NotRequired - Default map:{})
# You can set SchemaSyncCheck settings.
# This property is valid when the property 'user' is set.
# Elements of this map are as below:
# o url: The URL for connecting database. (NotRequired - Default same as databaseInfoMap)
# o schema: The schema name. (NotRequired - Default ''; e.g. not set for MySQL)
# o user: The database user name. (Required)
# o password: The database password. (NotRequired - Default '')
#
#; schemaSyncCheckMap = map:{
# ; url = jdbc:...
# ; schema = EXAMPLEDB
# ; user = exampuser
# ; password = exampword
#}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o propertiesHtmlMap: (NotRequired - Default map:{})
# You can set PropertiesHtml settings.
# Elements of this map are as below:
# o key of map: Properties Title Name
# o baseDir: base directory for directory property. (NotRequired)
# o rootFile: root file to read properties (Required)
# o environmentMap: map of environment files, the value is dir path (NotRequired)
# o diffIgnoredKeyList: list of ignored keys for differences (NotRequired)
# o maskedKeyList: list of masked keys for security (NotRequired)
# o isEnvOnlyFloatLeft: is it environment only? (and show as float-left?) (NotRequired)
# o extendsPropRequest: other request name of extends-properties (NotRequired)
# o isCheckImplicitOverride: does it check implicit override? (NotRequired)
#
#; propertiesHtmlMap = map:{
# ; ApplicationProperties = map:{
# ; baseDir = ../src
# ; rootFile = $$baseDir$$/main/resources/application_ja.properties
# ; environmentMap = map:{
# ; integration = $$baseDir$$/integration/resources
# ; production = $$baseDir$$/production/resources
# }
# ; diffIgnoredKeyList = list:{}
# ; maskedKeyList = list:{}
# ; isEnvOnlyFloatLeft = false
# ; extendsPropRequest = null
# ; isCheckImplicitOverride = false
# }
#}
# - - - - - - - - - -/
}
# ----------------/


@ -0,0 +1,55 @@
map:{
; ElasticsearchFessConfigGen = map:{
; resourceMap = map:{
; resourceType = ELASTICSEARCH
; resourceFile = ../src/main/config/es/fess_config.json
}
; outputMap = map:{
; templateFile = unused
; outputDirectory = ../src/main/java
; package = org.codelibs.fess.es
; className = unused
}
; tableMap = map:{
; tablePath = .fess_config -> mappings -> map
; mappingMap = map:{
; type = map:{
; string = String
; integer = Integer
; long = Long
; float = Float
; double = Double
; boolean = Boolean
; date = java.time.LocalDateTime
}
}
}
}
; ElasticsearchSearchLogGen = map:{
; resourceMap = map:{
; resourceType = ELASTICSEARCH
; resourceFile = ../src/main/config/es/search_log.json
}
; outputMap = map:{
; templateFile = unused
; outputDirectory = ../src/main/java
; package = org.codelibs.fess.es
; className = unused
}
; tableMap = map:{
; tablePath = search_log -> mappings -> map
; mappingMap = map:{
; type = map:{
; string = String
; integer = Integer
; long = Long
; float = Float
; double = Double
; boolean = Boolean
; date = java.time.LocalDateTime
}
}
}
}
}
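The `mappingMap` entries above translate Elasticsearch field types into Java property types for the generated classes. A minimal sketch of that lookup table follows; the `EsTypeMapping` class is hypothetical, shown only to make the mapping concrete, and the fallback behavior for unknown types is an assumption, not the generator's documented behavior.

```java
import java.util.Map;

// Sketch of the Elasticsearch-type -> Java-type table configured in mappingMap above.
// EsTypeMapping is a hypothetical illustration, not a DBFlute class.
public class EsTypeMapping {

    static final Map<String, String> TYPE_MAP = Map.of(
            "string", "String",
            "integer", "Integer",
            "long", "Long",
            "float", "Float",
            "double", "Double",
            "boolean", "Boolean",
            "date", "java.time.LocalDateTime");

    public static String toJavaType(String esType) {
        // Unknown types fall back to String here (an assumption for this sketch).
        return TYPE_MAP.getOrDefault(esType, "String");
    }

    public static void main(String[] args) {
        System.out.println(toJavaType("date")); // java.time.LocalDateTime
    }
}
```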


@ -0,0 +1,99 @@
# /---------------------------------------------------------------------------
# includeQueryMap: (NotRequired - Default map:{})
#
# Specification:
# map:{
# ; [property-type] = map:{
# ; [condition-key] = map:{ [table] = list:{ [column] ; [column] } }
# }
#
# property-type: String, Number, Date, OrderBy, ...
# condition-key: NotEqual, GreaterThan, LessThan, GreaterEqual, LessEqual
# , InScope, NotInScope, PrefixSearch, LikeSearch, NotLikeSearch
# , EmptyString, FromTo, DateFromTo, RangeOf, ...
# , (and prefix '!' means excluding, '%' means reviving)
# table: table name (hint) or $$ALL$$
# column: column name (hint) or $$CommonColumn$$ or $$VersionNo$$
#
# Example:
# map:{
# # This means that String includes GreaterThan at MEMBER.MEMBER_ACCOUNT only
# # and LessThan at PRODUCT.PRODUCT_NAME and PRODUCT.PRODUCT_HANDLE_CODE,
# # and InScope for LONGVARCHAR (e.g. text type) is excluded.
# ; String = map:{
# ; GreaterThan = map:{ MEMBER = list:{ MEMBER_ACCOUNT } }
# ; LessThan = map:{ PRODUCT = list:{ PRODUCT_NAME ; PRODUCT_HANDLE_CODE } }
# ; !InScope = map:{ $$ALL$$ = list:{ type:LONGVARCHAR } }
# }
# # This means that Number excludes all version-no's NotEqual.
# ; Number = map:{
# ; !NotEqual = map:{ $$ALL$$ = list:{ $$VersionNo$$ } }
# }
# # This means that Date does not include NotEqual for any table.
# ; Date = map:{
# ; NotEqual = map:{}
# }
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
; String = map:{
# [Include]
# Basically, String columns do not need
# these condition-keys to be set.
#; GreaterThan = map:{}
#; LessThan = map:{}
#; GreaterEqual = map:{}
#; LessEqual = map:{}
# [Exclude]
# Basically, common columns of String type do not need
# these condition-keys to be set.
#; !NotEqual = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !GreaterThan = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !LessThan = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !GreaterEqual = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !LessEqual = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !InScope = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !NotInScope = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !PrefixSearch = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !LikeSearch = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !NotLikeSearch = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
}
; Number = map:{
# [Include]
# ...
# [Exclude]
# Basically, the VersionNo column does not need
# these condition-keys to be set.
#; !NotEqual = map:{ $$ALL$$ = list:{ $$VersionNo$$ } }
#; !GreaterThan = map:{ $$ALL$$ = list:{ $$VersionNo$$ } }
#; !LessThan = map:{ $$ALL$$ = list:{ $$VersionNo$$ } }
#; !GreaterEqual = map:{ $$ALL$$ = list:{ $$VersionNo$$ } }
#; !LessEqual = map:{ $$ALL$$ = list:{ $$VersionNo$$ } }
#; !RangeOf = map:{ $$ALL$$ = list:{ $$VersionNo$$ } }
#; !InScope = map:{ $$ALL$$ = list:{ $$VersionNo$$ } }
#; !NotInScope = map:{ $$ALL$$ = list:{ $$VersionNo$$ } }
}
; Date = map:{
# [Include]
# Basically, Date columns do not need
# these condition-keys to be set.
; NotEqual = map:{}
; InScope = map:{}
; NotInScope = map:{}
# [Exclude]
# Basically, common columns of Date type do not need
# these condition-keys to be set.
#; !GreaterThan = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !LessThan = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !GreaterEqual = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !LessEqual = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !FromTo = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
#; !DateFromTo = map:{ $$ALL$$ = list:{ $$CommonColumn$$ } }
}
}
# ----------------/


@ -0,0 +1,282 @@
# /---------------------------------------------------------------------------
# littleAdjustmentMap: (NotRequired - Default map:{})
#
# The various settings about a little adjustment.
#
# o isAvailableAddingSchemaToTableSqlName: (NotRequired - Default false)
# o isAvailableAddingCatalogToTableSqlName: (NotRequired - Default false)
# o isAvailableDatabaseDependency: (NotRequired - Default false)
# o isAvailableDatabaseNativeJDBC: (NotRequired - Default false)
# o isAvailableNonPrimaryKeyWritable: (NotRequired - Default false)
# o classificationUndefinedHandlingType: (NotRequired - Default LOGGING)
# o isEntityConvertEmptyStringToNull: (NotRequired - Default false)
# o isMakeConditionQueryEqualEmptyString: (NotRequired - Default false)
# o isTableDispNameUpperCase: (NotRequired - Default false)
# o isTableSqlNameUpperCase: (NotRequired - Default false)
# o isColumnSqlNameUpperCase: (NotRequired - Default false)
# o isMakeDeprecated: (NotRequired - Default false)
# o isMakeRecentlyDeprecated: (NotRequired - Default true)
# o extendedDBFluteInitializerClass: (NotRequired - Default null)
# o extendedImplementedInvokerAssistantClass: (NotRequired - Default null)
# o extendedImplementedCommonColumnAutoSetupperClass: (NotRequired - Default null)
# o shortCharHandlingMode: (NotRequired - Default NONE)
# o quoteTableNameList: (NotRequired - Default list:{})
# o quoteColumnNameList: (NotRequired - Default list:{})
# o columnNullObjectMap: (NotRequired - Default map:{})
# o relationalNullObjectMap: (NotRequired - Default map:{})
# o cursorSelectFetchSize: (NotRequired - Default null)
#
# *The line that starts with '#' means comment-out.
#
map:{
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isAvailableAddingSchemaToTableSqlName: (NotRequired - Default false)
# [true]
# Add schema to table SQL name. (The table name on query is SCHEMA.TABLE)
#
# [false]
# Non.
#
#; isAvailableAddingSchemaToTableSqlName = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isAvailableAddingCatalogToTableSqlName: (NotRequired - Default false)
# [true]
# Add catalog to table SQL name. (The table name on query is CATALOG.SCHEMA.TABLE)
# This property works only when isAvailableAddingSchemaToTableSqlName is true.
#
# [false]
# Non.
#
#; isAvailableAddingCatalogToTableSqlName = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isAvailableDatabaseDependency: (NotRequired - Default false)
# [true]
# Generate the method that depends on the database.
# For example: cb.lockWithRR() at DB2.
#
# [false]
# Non.
#
#; isAvailableDatabaseDependency = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isAvailableDatabaseNativeJDBC: (NotRequired - Default false)
# [true]
# Use classes of database native JDBC on generated classes
# to get best performances of DB access.
# Your project needs to refer to database native JDBC.
#
# [false]
# Non.
#
#; isAvailableDatabaseNativeJDBC = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isAvailableNonPrimaryKeyWritable: (NotRequired - Default false)
# [true]
# Generate writable methods at non-primary-key table.
#
# [false]
# Non.
#
#; isAvailableNonPrimaryKeyWritable = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o classificationUndefinedHandlingType: (NotRequired - Default LOGGING)
# The handling type when undefined classification is found.
#
# EXCEPTION - throws exception when found
# LOGGING - logging only when found (exception if ReplaceSchema)
# ALLOWED - no action
#
#; classificationUndefinedHandlingType = LOGGING
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isEntityConvertEmptyStringToNull: (NotRequired - Default false)
# [true]
# Convert empty-string data to null in entity.
#
# [false]
# Non.
#
#; isEntityConvertEmptyStringToNull = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isMakeConditionQueryEqualEmptyString: (NotRequired - Default false)
# [true]
# Make equal-empty-string methods of condition-query.
# For example: cb.query().setMemberName_Equal_EmptyString()
#
# [false]
# Non.
#
#; isMakeConditionQueryEqualEmptyString = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isTableDispNameUpperCase: (NotRequired - Default false)
# [true]
# Table names for display, e.g. on documents,
# are forcedly treated as upper case.
#
# [false]
# Non.
#
#; isTableDispNameUpperCase = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isTableSqlNameUpperCase: (NotRequired - Default false)
# [true]
# Table names on SQL executed by condition-bean or behavior
# are forcedly treated as upper case. (except outside-SQL)
#
# [false]
# Non.
#
#; isTableSqlNameUpperCase = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isColumnSqlNameUpperCase: (NotRequired - Default false)
# [true]
# Column names on SQL executed by condition-bean or behavior
# are forcedly treated as upper case. (except outside-SQL)
#
# [false]
# Non.
#
#; isColumnSqlNameUpperCase = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isMakeDeprecated: (NotRequired - Default false)
# [true]
# Make deprecated methods, classes, and so on...
# *You should keep this property 'false'!
#
# [false]
# Non.
#
#; isMakeDeprecated = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isMakeRecentlyDeprecated: (NotRequired - Default true)
# [true]
# Make RECENTLY deprecated methods, classes, and so on...
# *You should keep this property 'false'!
#
# [false]
# Non.
#
#; isMakeRecentlyDeprecated = true
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o extendedDBFluteInitializerClass: (NotRequired - Default null)
# If you want to extend the embedded DBFlute initializer,
# specify the class name of your original initializer
# that extends the embedded one.
# *Basically for fixed DBFluteConfig settings
#
#; extendedDBFluteInitializerClass = com.example.ExtendedDBFluteInitializer
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o extendedImplementedInvokerAssistantClass: (NotRequired - Default null)
# If you want to extend the embedded invoker assistant,
# specify the class name of your original invoker assistant
# that extends the embedded one.
# *Basically you SHOULD NOT specify this property!
#
#; extendedImplementedInvokerAssistantClass = com.example.ExtendedImplementedInvokerAssistant
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o extendedImplementedCommonColumnAutoSetupperClass: (NotRequired - Default null)
# If you want to extend the embedded common column auto setupper,
# specify the class name of your original common column auto setupper
# that extends the embedded one.
# *Basically you SHOULD NOT specify this property!
#
#; extendedImplementedCommonColumnAutoSetupperClass = com.example.ExtendedImplementedCommonColumnAutoSetupper
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o shortCharHandlingMode: (NotRequired - Default NONE)
# If a parameter of condition-bean or parameter-bean is shorter than the column size:
# NONE - Do nothing. (default)
# EXCEPTION - It throws an exception.
# RFILL - It pads the parameter with spaces on the right.
# LFILL - It pads the parameter with spaces on the left.
#
#; shortCharHandlingMode = NONE
# - - - - - - - - - -/
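A minimal Java sketch of the RFILL/LFILL behavior: padding a short parameter with spaces up to the CHAR column size. The `CharPadding` helper is hypothetical and for illustration only; DBFlute applies this handling internally.

```java
// Sketch of shortCharHandlingMode padding: RFILL pads on the right,
// LFILL pads on the left, up to the CHAR column size.
// CharPadding is a hypothetical illustration, not a DBFlute class.
public class CharPadding {

    public static String rfill(String value, int size) {
        StringBuilder sb = new StringBuilder(value);
        while (sb.length() < size) {
            sb.append(' '); // append spaces on the right
        }
        return sb.toString();
    }

    public static String lfill(String value, int size) {
        StringBuilder sb = new StringBuilder(value);
        while (sb.length() < size) {
            sb.insert(0, ' '); // insert spaces on the left
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println("[" + rfill("AB", 4) + "]"); // [AB  ]
        System.out.println("[" + lfill("AB", 4) + "]"); // [  AB]
    }
}
```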
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o quoteTableNameList: (NotRequired - Default list:{})
# The list of table DB names that need to be quoted.
# Specified tables are quoted in auto-generated SQL.
#
#; quoteTableNameList = list:{}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o quoteColumnNameList: (NotRequired - Default list:{})
# The list of column DB names that need to be quoted.
# Specified columns are quoted in auto-generated SQL.
#
#; quoteColumnNameList = list:{}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o columnNullObjectMap: (NotRequired - Default map:{})
# You can get a null object when the column is null.
#
#; columnNullObjectMap = map:{
# ; providerPackage = $$packageBase$$.nogen.cache
# ; isGearedToSpecify = true
# ; columnMap = map:{
# ; MEMBER_STATUS = map:{
# ; DESCRIPTION = CachedMemberStatus.get(this, "$$columnName$$", $$primaryKey$$)
# }
# ; MEMBER_SECURITY = map:{
# ; REMINDER_ANSWER = CachedMemberSecurity.get(this, "$$columnName$$", $$primaryKey$$)
# ; REMINDER_QUESTION = CachedMemberSecurity.get(this, "$$columnName$$", $$primaryKey$$)
# }
# }
#}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o relationalNullObjectMap: (NotRequired - Default map:{})
# You can get a null object when the relation is null.
#
#; relationalNullObjectMap = map:{
# ; providerPackage = $$packageBase$$.nogen.cache
# ; foreignMap = map:{
# ; MEMBER_STATUS = CachedMemberStatus.get(this, "$$foreignPropertyName$$", $$primaryKey$$)
# ; MEMBER_SECURITY = CachedMemberSecurity.get(this, "$$foreignPropertyName$$", $$primaryKey$$)
# }
#}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o cursorSelectFetchSize: (NotRequired - Default null)
# The fetch size of JDBC parameter for cursor select.
# For example, specify Integer.MIN_VALUE to enable row-by-row fetch on MySQL.
#
#; cursorSelectFetchSize = Integer.MIN_VALUE
# - - - - - - - - - -/
}
# ----------------/


@ -0,0 +1,28 @@
# /---------------------------------------------------------------------------
# optimisticLockDefinitionMap: (NotRequired - Default map:{})
#
# The definition for optimistic lock of DBFlute.
#
# o updateDateFieldName: (NotRequired - Default '')
# o versionNoFieldName: (NotRequired - Default 'VERSION_NO')
#
# *The line that starts with '#' means comment-out.
#
map:{
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o updateDateFieldName: (NotRequired - Default '')
# The column name of update date for optimistic lock.
#
#; updateDateFieldName = UPDATE_DATE
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o versionNoFieldName: (NotRequired - Default 'VERSION_NO')
# The column name of version no for optimistic lock.
# Basically you don't need this if your tables have the column 'VERSION_NO',
# because the default value is 'VERSION_NO'.
#
#; versionNoFieldName = VERSION_NO
# - - - - - - - - - -/
}
# ----------------/


@ -0,0 +1,187 @@
# /---------------------------------------------------------------------------
# outsideSqlDefinitionMap: (NotRequired - Default map:{})
#
# The various settings about outsideSql.
#
# o isGenerateProcedureParameterBean: (NotRequired - Default false)
# o isGenerateProcedureCustomizeEntity: (NotRequired - Default false)
# o targetProcedureCatalogList: (NotRequired - Default list:{})
# o targetProcedureSchemaList: (NotRequired - Default list:{})
# o targetProcedureNameList: (NotRequired - Default list:{})
# o executionMetaProcedureNameList: (NotRequired - Default list:{})
# o procedureSynonymHandlingType: (NotRequired - Default NONE)
# o isRequiredSqlTitle: (NotRequired - Default true)
# o isRequiredSqlDescription: (NotRequired - Default true)
# o sqlFileEncoding: (NotRequired - Default 'UTF-8')
# o sqlDirectory: (NotRequired - Default generateOutputDirectory & resourceOutputDirectory)
# o sql2EntityOutputDirectory: (NotRequired - Default generateOutputDirectory)
# o applicationOutsideSqlMap: (NotRequired - Default map:{})
# o sqlPackage: (NotRequired - Default all packages)
#
# *The line that starts with '#' means comment-out.
#
map:{
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isGenerateProcedureParameterBean: (NotRequired - Default false)
# [true]
# The parameter beans for procedure are auto-generated.
# If you call the procedure from DBFlute, you should specify 'true'!
#
# [false]
# Non.
#
; isGenerateProcedureParameterBean = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isGenerateProcedureCustomizeEntity: (NotRequired - Default false)
# [true]
# The customize entities for procedures' out-parameters
# and not-param results are auto-generated,
# together with the not-param results' properties.
# Target procedures are actually executed by the Sql2Entity task
# (to obtain execution metadata, i.e. result-set metadata).
# This property is valid only when isGenerateProcedureParameterBean is true.
#
# [false]
# Non.
#
; isGenerateProcedureCustomizeEntity = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o targetProcedureCatalogList: (NotRequired - Default list:{})
# You can specify target catalog of generated parameter bean for procedure.
# This property is valid only when isGenerateProcedureParameterBean is true.
#
#; targetProcedureCatalogList = list:{FOO_CATALOG ; prefix:FOO_ ; suffix:_FOO ; contain:_FOO_}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o targetProcedureSchemaList: (NotRequired - Default list:{})
# You can specify target schema of generated parameter bean for procedure.
# This property is valid only when isGenerateProcedureParameterBean is true.
# e.g. list:{PROCEDUREDB}
#
#; targetProcedureSchemaList = list:{FOO_SCHEMA ; prefix:FOO_ ; suffix:_FOO ; contain:_FOO_}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o targetProcedureNameList: (NotRequired - Default list:{})
# You can specify target name of generated parameter bean for procedure.
# This property is valid only when isGenerateProcedureParameterBean is true.
# e.g. list:{prefix:SP_}
# You can also specify procedures accessed through a DB link.
# These are treated as additional settings,
# independent of the specifications for the main schema.
# e.g. SP_FOO@NEXT_LINK (when DB link name is 'NEXT_LINK')
#
#; targetProcedureNameList = list:{FOO_PROCEDURE ; prefix:FOO_ ; suffix:_FOO ; contain:_FOO_}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o executionMetaProcedureNameList: (NotRequired - Default list:{})
# You can specify target name of generated customize entity for procedure.
# This property is valid only when isGenerateProcedureCustomizeEntity is true.
# e.g. list:{prefix:SP_}
#
#; executionMetaProcedureNameList = list:{FOO_PROCEDURE ; prefix:FOO_ ; suffix:_FOO ; contain:_FOO_}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o procedureSynonymHandlingType: (NotRequired - Default NONE)
# You can specify the handling type of procedure synonym.
# NONE - No handling. (default)
# INCLUDE - It includes procedure synonyms.
# SWITCH - It switches all normal procedures to procedure synonyms.
#
#; procedureSynonymHandlingType = NONE
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isRequiredSqlTitle: (NotRequired - Default true)
# [true]
# You should always write the title of outsideSql.
# If it doesn't exist, the OutsideSqlTest task fails.
#
# [false]
# Non.
#
#; isRequiredSqlTitle = true
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isRequiredSqlDescription: (NotRequired - Default true)
# [true]
# You should always write the description of outsideSql.
# If it doesn't exist, the OutsideSqlTest task fails.
#
# [false]
# Non.
#
#; isRequiredSqlDescription = true
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o sqlFileEncoding: (NotRequired - Default 'UTF-8')
# The encoding of SQL file for outsideSql.
# Basically you don't need this.
#
#; sqlFileEncoding = UTF-8
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o sqlDirectory: (NotRequired - Default generateOutputDirectory & resourceOutputDirectory)
# The directory of SQL file for outsideSql.
# Basically you don't need this if your directory structure is same as default.
# It's also for DBFlute library project when you use ApplicationOutsideSql.
#
#; sqlDirectory = ../src/main/resources
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o sql2EntityOutputDirectory: (NotRequired - Default generateOutputDirectory)
# The output directory of classes that is generated by Sql2Entity.
# Basically you don't need this if your directory structure is same as default.
# It's also for DBFlute library project when you use ApplicationOutsideSql.
#
#; sql2EntityOutputDirectory = ../src/main/java
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o applicationOutsideSqlMap: (NotRequired - Default map:{})
# You can register application projects that have their own outsideSql.
# Elements of this map are as below:
# o key of map: a relative path to the application project from DBFlute client
# o sqlDirectory: SQL directory as a relative path from the application directory
# (NotRequired - Default Java:'src/main/java' & 'src/main/resources' CSharp:'source')
# o sql2EntityOutputDirectory: source output directory as a relative path from the application directory
# (NotRequired - Default Java:'src/main/java' CSharp:'source')
#
#; applicationOutsideSqlMap = map:{
# ; ../../app1 = map:{
# ; sqlDirectory = src/main/resources
# ; sql2EntityOutputDirectory = src/main/java
# }
# ; ../../app2 = map:{
# ; sqlDirectory = src/main/resources
# ; sql2EntityOutputDirectory = src/main/java
# }
#}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o sqlPackage: (NotRequired - Default all packages)
# The package of SQL file for outsideSql.
# This is basically for narrowing the SQL search target,
# for example when the project has SQL files for another framework.
# So basically you don't need this.
#
# You can use variable '$$PACKAGE_BASE$$' that means 'packageBase'.
# But you need to put SQL files in 'exbhv' under the specified package
# if you use BehaviorQueryPath (e.g. MemberBhv_selectSimpleMember.sql).
#
#; sqlPackage = $$PACKAGE_BASE$$
# - - - - - - - - - -/
}
# ----------------/


@ -0,0 +1,19 @@
# /---------------------------------------------------------------------------
# refreshDefinitionMap: (NotRequired - Default map:{})
#
# If you use the synchronizer and specify this property,
# you don't need to refresh (F5) your Eclipse project.
#
# Specification:
# map:{
# ; projectName = [Eclipse Project1] / [Eclipse Project2] / ...
# ; requestUrl = [synchronizer's URL]
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
; projectName = $$AutoDetect$$
; requestUrl = http://localhost:8386/
}
# ----------------/


@ -0,0 +1,234 @@
# /---------------------------------------------------------------------------
# replaceSchemaDefinitionMap: (NotRequired - Default map:{})
#
# The various settings about replace-schema.
#
# o repsEnvType: (NotRequired - Default inherits or 'ut')
# o isLoggingInsertSql: (NotRequired - Default true)
# o isLoggingReplaceSql: (NotRequired - Default true)
# o isErrorSqlContinue: (NotRequired - Default false)
# o sqlFileEncoding: (NotRequired - Default 'UTF-8')
# o skipSheet: (NotRequired - Default '')
# o isIncrementSequenceToDataMax: (NotRequired - Default false)
# o isSuppressBatchUpdate: (NotRequired - Default false)
# o objectTypeTargetList: (NotRequired - Default databaseInfoMap's)
# o filterVariablesMap: (NotRequired - Default map:{})
# o additionalUserMap: (NotRequired - Default map:{})
# o additionalDropMapList: (NotRequired - Default list:{})
# o playSqlDirectory: (NotRequired - Default 'playsql')
# o applicationPlaySqlDirectory: (NotRequired - Default '')
# o arrangeBeforeRepsMap: (NotRequired - Default map:{})
# o isSuppressTruncateTable: (NotRequired - Default false)
# o isSuppressDropForeignKey: (NotRequired - Default false)
# o isSuppressDropTable: (NotRequired - Default false)
# o isSuppressDropSequence: (NotRequired - Default false)
# o isSuppressDropProcedure: (NotRequired - Default false)
# o isSuppressDropDBLink: (NotRequired - Default false)
# o initializeFirstSqlList: (NotRequired - Default list:{})
#
# *The line that starts with '#' means comment-out.
#
map:{
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o repsEnvType: (NotRequired - Default inherits or 'ut')
# The environment type of ReplaceSchema.
# e.g. if ut, data files in './playsql/data/ut/...' are loaded.
# If a DBFlute environment type is specified, it is inherited as the default.
#
#; repsEnvType = ut
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isLoggingInsertSql: (NotRequired - Default true)
# Does it show insert values in the log?
#
#; isLoggingInsertSql = true
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isLoggingReplaceSql: (NotRequired - Default true)
# Does it show replace-SQL in the log?
#
#; isLoggingReplaceSql = true
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isErrorSqlContinue: (NotRequired - Default false)
# Does it continue the task when an error SQL exists?
#
#; isErrorSqlContinue = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o sqlFileEncoding: (NotRequired - Default 'UTF-8')
# The encoding of SQL(DDL) file for Replace Schema.
# Basically you don't need this.
#
#; sqlFileEncoding = UTF-8
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o skipSheet: (NotRequired - Default '')
# You can specify sheets to skip by regular expression.
#
#; skipSheet = P.+
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isIncrementSequenceToDataMax: (NotRequired - Default false)
# Does it increment sequence values to the max value of the table data?
# Refers to the property 'sequenceDefinitionMap'.
#
#; isIncrementSequenceToDataMax = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isSuppressBatchUpdate: (NotRequired - Default false)
# Does it suppress batch update when loading data?
# When you have a data error, changing this property may give you
# more details about the error, because the BatchUpdateException
# information can be too short for debugging.
#
#; isSuppressBatchUpdate = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o objectTypeTargetList: (NotRequired - Default databaseInfoMap's)
# This property overrides the databaseInfoMap's setting for ReplaceSchema.
# e.g. Synonym of Oracle --> list:{TABLE ; VIEW ; SYNONYM}
#
#; objectTypeTargetList = list:{TABLE ; VIEW}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o filterVariablesMap: (NotRequired - Default map:{})
# You can specify the filter variables for DDL.
#
#; filterVariablesMap = map:{abc=AAA}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o additionalUserMap: (NotRequired - Default map:{})
# You can set additional users.
# Elements of this map are as below:
# o key of map: User Definition Name (userDefName)
# o url: (NotRequired - Default same as one of main schema)
# o schema: (NotRequired - Default treated as no schema setting)
# o user: (Required)
# o password: password plainly or path to password file (with default password)
# e.g. foo or df:dfprop/system-password.txt|foo
# (NotRequired - Default '')
#   o isSkipIfNotFoundPasswordFileAndDefault: Does it skip the user's SQL statement
#       when a password file is used but neither the file nor a default password is found?
# (NotRequired - Default false)
#
#; additionalUserMap = map:{
# ; system = map:{
# #; url = ...
# #; schema = ...
# ; user = system
# ; password = df:dfprop/system-password.txt
# }
#}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o additionalDropMapList: (NotRequired - Default list:{})
# You can drop additional other schemas.
# Elements of this map are as below:
# o url: (NotRequired - Default same as main schema)
#   o schema: (Required; not required only when an empty schema is valid for the DBMS)
# o user: (NotRequired - Default same as main schema)
# o password: (NotRequired - Default same as main schema)
# o propertiesMap: (NotRequired - Default map:{})
#   o objectTypeTargetList: (NotRequired - Default list:{TABLE;VIEW})
#
#; additionalDropMapList = list:{
# ; map:{
# ; url = jdbc:oracle:thin:...
# ; schema = NEXTEXAMPLEDB
# ; user = NEXTEXAMPLEDB
# ; password = NEXTEXAMPLEDB
# ; propertiesMap = map:{}
# ; objectTypeTargetList = list:{TABLE;VIEW}
# }
#}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o playSqlDirectory: (NotRequired - Default 'playsql' relative to DBFlute client)
# This property is the relative path to the (main) PlaySql directory.
# You should not use this property casually.
#
#; playSqlDirectory = ../../foo-project/playsql
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o applicationPlaySqlDirectory: (NotRequired - Default '')
# This property is the relative path to the Application PlaySql directory,
# which is basically used with ApplicationBehavior.
#
#; applicationPlaySqlDirectory = ../../foo-project/dbflute_apbranch/playsql
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o arrangeBeforeRepsMap: (NotRequired - Default map:{})
# You can arrange resource files before ReplaceSchema.
#
#; arrangeBeforeRepsMap = map:{
# ; copy = map:{
# ; ../erd/*.ddl = ./playsql/replace-schema-10-basic.sql
# }
#}
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isSuppressTruncateTable: (NotRequired - Default false)
# You can suppress truncating tables at initializing schema.
#
#; isSuppressTruncateTable = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isSuppressDropForeignKey: (NotRequired - Default false)
# You can suppress dropping foreign keys at initializing schema.
#
#; isSuppressDropForeignKey = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isSuppressDropTable: (NotRequired - Default false)
# You can suppress dropping tables at initializing schema.
#
#; isSuppressDropTable = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isSuppressDropSequence: (NotRequired - Default false)
# You can suppress dropping sequences at initializing schema.
#
#; isSuppressDropSequence = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isSuppressDropProcedure: (NotRequired - Default false)
# You can suppress dropping procedures at initializing schema.
#
#; isSuppressDropProcedure = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o isSuppressDropDBLink: (NotRequired - Default false)
# You can suppress dropping DB links at initializing schema.
#
#; isSuppressDropDBLink = false
# - - - - - - - - - -/
# /- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
# o initializeFirstSqlList: (NotRequired - Default list:{})
# You can execute the SQL statements before initializing schema.
#
#; initializeFirstSqlList = list:{}
# - - - - - - - - - -/
}
# ----------------/

View file

@ -0,0 +1,25 @@
# /---------------------------------------------------------------------------
# sequenceDefinitionMap: (NotRequired - Default map:{})
#
# The relation mappings between sequence and table.
# If you don't specify the mappings, you cannot insert
# a record into the table by sequence.
# The table names are treated as case-insensitive.
#
# Example:
# map:{
# ; PURCHASE = SEQ_PURCHASE
# ; MEMBER = SEQ_MEMBER
# ; MEMBER_LOGIN = SEQ_MEMBER_LOGIN
# ; PRODUCT = SEQ_PRODUCT
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
#; PURCHASE = SEQ_PURCHASE
#; MEMBER = SEQ_MEMBER
#; MEMBER_LOGIN = SEQ_MEMBER_LOGIN
#; PRODUCT = SEQ_PRODUCT
}
# ----------------/

View file

@ -0,0 +1,57 @@
# /---------------------------------------------------------------------------
# typeMappingMap: (NotRequired - Default map:{})
#
# If you want to change the default mappings,
# you can specify your own original mappings.
# But unanticipated problems may occur, so be careful!
#
# About '$$AutoMapping$$':
# If the database is Oracle, this is often used.
# For example, when you use this for NUMERIC:
#  o Numeric( 1 - 9 , 0) is mapped to INTEGER
#  o Numeric(10 - 18 , 0) is mapped to BIGINT
#  o Numeric(19 - 38 , 0) is mapped to NUMERIC
#  o Numeric( 1 - 38 , 2) is mapped to NUMERIC
#
# Example:
# map:{
# ; INTEGER = java.lang.Integer
# ; BIGINT = java.lang.Long
# }
#
# *The line that starts with '#' means comment-out.
#
map:{
# AutoMapping for Numeric and Decimal
; NUMERIC = $$AutoMapping$$ ; DECIMAL = $$AutoMapping$$
}
# ----------------/
#
# Default mapping as follows:
# --------------------------------------------------------
# | JDBC Type | Java Native | CSharp Native |
# | ------------------------------------------------------
# | CHAR | java.lang.String | String |
# | VARCHAR | java.lang.String | String |
# | LONGVARCHAR | java.lang.String | String |
# | NUMERIC | java.math.BigDecimal | decimal? |
# | DECIMAL | java.math.BigDecimal | decimal? |
# | TINYINT | java.lang.Integer | int? |
# | SMALLINT | java.lang.Integer | int? |
# | INTEGER | java.lang.Integer | int? |
# | BIGINT | java.lang.Long | long? |
# | REAL | java.math.BigDecimal | decimal? |
# | FLOAT | java.math.BigDecimal | decimal? |
# | DOUBLE | java.math.BigDecimal | decimal? |
# | DATE | java.util.Date | DateTime? |
# | TIME | java.sql.Time | DateTime? |
# | TIMESTAMP | java.sql.Timestamp | DateTime? |
# | BIT | java.lang.Boolean | bool? |
# | BOOLEAN | java.lang.Boolean | bool? |
# | BINARY | byte[] | byte[] |
# | VARBINARY | byte[] | byte[] |
# | LONGVARBINARY | byte[] | byte[] |
# | BLOB | byte[] | byte[] |
# | ARRAY | *Unsupported | *Unsupported |
# | UUID | java.util.UUID | *Unsupported |
# --------------------------------------------------------
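The $$AutoMapping$$ ranges documented above can be expressed as a small decision function. This is only a sketch of the documented rules for NUMERIC/DECIMAL; the class and method names are hypothetical, not DBFlute's actual resolver.

```java
// Sketch of the documented $$AutoMapping$$ rules for NUMERIC/DECIMAL.
// Hypothetical names; DBFlute's real type resolver is more involved.
public class AutoMappingSketch {
    static String mapNumeric(int precision, int scale) {
        if (scale > 0) {
            return "java.math.BigDecimal"; // Numeric(1-38, 2) -> NUMERIC
        }
        if (precision <= 9) {
            return "java.lang.Integer";    // Numeric(1-9, 0) -> INTEGER
        }
        if (precision <= 18) {
            return "java.lang.Long";       // Numeric(10-18, 0) -> BIGINT
        }
        return "java.math.BigDecimal";     // Numeric(19-38, 0) -> NUMERIC
    }

    public static void main(String[] args) {
        System.out.println(mapNumeric(5, 0));  // java.lang.Integer
        System.out.println(mapNumeric(12, 0)); // java.lang.Long
        System.out.println(mapNumeric(20, 0)); // java.math.BigDecimal
    }
}
```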

View file

@ -0,0 +1,5 @@
Directory for library extension
If you use a database for which DBFlute does not bundle a JDBC driver,
put your own JDBC driver for that database here.
(e.g. Oracle, DB2, SQLServer)

Binary file not shown.

View file

@ -0,0 +1,173 @@
$manager.info("requestList: ${requestList.size()}")
#foreach ($request in $requestList)
#set ($tableMap = $request.tableMap)
$request.enableOutputDirectory()
$manager.makeDirectory($request.generateDirPath)
#if ($request.isResourceTypeJsonSchema())
#if ($request.requestName == "JsonBeanGen")
##
## <<< Json Schema Gen >>>
##
#foreach ($table in $request.tableList)
#set ($path = "${request.generateDirPath}/bean/bs/Bs${table.camelizedName}.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./json/BsJsonBean.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/bean/ex/${table.camelizedName}.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
#if (!$files.file(${generator.outputPath},$path).exists())
$generator.parse("./json/ExJsonBean.vm", $path, "", "")
#end
#end
#end
#elseif ($request.isResourceTypeSolr())
#if ($request.requestName == "SolrBeanGen")
##
## <<< Solr (Xml) Gen >>>
##
#set ($table = $request.table)
#set ($path = "${request.generateDirPath}/bean/bs/${tableMap.baseBeanClassName}.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./solr/BsSolrBean.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/bean/ex/${tableMap.extendedBeanClassName}.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
#if (!$files.file(${generator.outputPath},$path).exists())
$generator.parse("./solr/ExSolrBean.vm", $path, "", "")
#end
#end
#elseif ($request.isResourceTypeElasticsearch())
#if ($request.requestName.startsWith("Elasticsearch"))
##
## <<< Elasticsearch Schema Gen >>>
##
#foreach ($table in $request.tableList)
#set ($path = "${request.generateDirPath}/bsentity/dbmeta/${table.camelizedName}Dbm.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/DBMeta.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/bsentity/Bs${table.camelizedName}.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/BsEntity.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/exentity/${table.camelizedName}.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
#if (!$files.file(${generator.outputPath},$path).exists())
$generator.parse("./elasticsearch/ExEntity.vm", $path, "", "")
#end
#set ($path = "${request.generateDirPath}/cbean/bs/Bs${table.camelizedName}CB.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/BsConditionBean.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/cbean/${table.camelizedName}CB.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
#if (!$files.file(${generator.outputPath},$path).exists())
$generator.parse("./elasticsearch/ExConditionBean.vm", $path, "", "")
#end
#set ($path = "${request.generateDirPath}/cbean/cq/bs/Bs${table.camelizedName}CQ.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/BsConditionQuery.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/cbean/cq/${table.camelizedName}CQ.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
#if (!$files.file(${generator.outputPath},$path).exists())
$generator.parse("./elasticsearch/ExConditionQuery.vm", $path, "", "")
#end
#set ($path = "${request.generateDirPath}/cbean/cf/bs/Bs${table.camelizedName}CF.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/BsConditionFilter.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/cbean/cf/${table.camelizedName}CF.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
#if (!$files.file(${generator.outputPath},$path).exists())
$generator.parse("./elasticsearch/ExConditionFilter.vm", $path, "", "")
#end
#set ($path = "${request.generateDirPath}/bsbhv/Bs${table.camelizedName}Bhv.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/BsBehavior.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/exbhv/${table.camelizedName}Bhv.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
#if (!$files.file(${generator.outputPath},$path).exists())
$generator.parse("./elasticsearch/ExBehavior.vm", $path, "", "")
#end
#end
#if ($request.requestName == "ElasticsearchFessConfigGen")
#set ($path = "${request.generateDirPath}/cbean/bs/AbstractConditionBean.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/AbstractConditionBean.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/cbean/cq/bs/AbstractConditionQuery.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/AbstractConditionQuery.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/cbean/cf/bs/AbstractConditionFilter.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/AbstractConditionFilter.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/bsentity/AbstractEntity.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/AbstractEntity.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/bsbhv/AbstractBehavior.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/AbstractBehavior.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/cbean/result/EsPagingResultBean.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/EsPagingResultBean.vm", $path, "", "")
#set ($path = "${request.generateDirPath}/cbean/sqlclause/SqlClauseEs.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse("./elasticsearch/SqlClauseEs.vm", $path, "", "")
#end
#end
#else
##
## <<< Normal Gen >>>
##
#if ($request.isOnlyOneTable())
#set ($table = $request.table)
$request.info("parse('${request.generateFilePath}')")
$generator.parse($request.templatePath, $request.generateFilePath, "", "")
#else
#foreach ($table in $request.tableList)
#set ($path = "${request.generateDirPath}/${table.camelizedName}.java")
$manager.makeDirectory($path)
$request.info("parse('${path}')")
$generator.parse($request.templatePath, $path, "", "")
#end
#end
#end
#end

View file

@ -0,0 +1,533 @@
package ${request.package}.bsbhv;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Iterator;
import java.util.List;
import java.util.ListIterator;
import java.util.Map;
import java.util.function.Function;
import javax.annotation.Resource;
import ${request.package}.bsentity.AbstractEntity;
import ${request.package}.bsentity.AbstractEntity.DocMeta;
import ${request.package}.bsentity.AbstractEntity.RequestOptionCall;
import ${request.package}.cbean.bs.AbstractConditionBean;
import ${request.package}.cbean.result.EsPagingResultBean;
import org.dbflute.Entity;
import org.dbflute.bhv.AbstractBehaviorWritable;
import org.dbflute.bhv.readable.EntityRowHandler;
import org.dbflute.bhv.writable.DeleteOption;
import org.dbflute.bhv.writable.InsertOption;
import org.dbflute.bhv.writable.UpdateOption;
import org.dbflute.cbean.ConditionBean;
import org.dbflute.cbean.coption.CursorSelectOption;
import org.dbflute.cbean.result.ListResultBean;
import org.dbflute.exception.IllegalBehaviorStateException;
import org.elasticsearch.action.bulk.BulkItemResponse;
import org.elasticsearch.action.bulk.BulkRequestBuilder;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.count.CountRequestBuilder;
import org.elasticsearch.action.delete.DeleteRequestBuilder;
import org.elasticsearch.action.delete.DeleteResponse;
import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.action.search.SearchType;
import org.elasticsearch.action.update.UpdateRequestBuilder;
import org.elasticsearch.client.Client;
import org.elasticsearch.search.SearchHit;
import org.elasticsearch.search.SearchHits;
/**
* @author FreeGen
*/
public abstract class AbstractBehavior<ENTITY extends Entity, CB extends ConditionBean> extends AbstractBehaviorWritable<ENTITY, CB> {
@Resource
protected Client client;
protected int sizeForDelete = 100;
protected String scrollForDelete = "1m";
protected int sizeForCursor = 100;
protected String scrollForCursor = "1m";
protected abstract String asEsIndex();
protected abstract String asEsIndexType();
protected abstract String asEsSearchType();
protected abstract <RESULT extends ENTITY> RESULT createEntity(Map<String, Object> source, Class<? extends RESULT> entityType);
@Override
protected int delegateSelectCountUniquely(final ConditionBean cb) {
// TODO check response and cast problem
final CountRequestBuilder builder = client.prepareCount(asEsIndex()).setTypes(asEsSearchType());
return (int) ((AbstractConditionBean) cb).build(builder).execute().actionGet().getCount();
}
@Override
protected <RESULT extends ENTITY> List<RESULT> delegateSelectList(final ConditionBean cb, final Class<? extends RESULT> entityType) {
// TODO check response
final SearchRequestBuilder builder = client.prepareSearch(asEsIndex()).setTypes(asEsSearchType());
if (cb.isFetchScopeEffective()) {
builder.setFrom(cb.getFetchStartIndex());
builder.setSize(cb.getFetchSize());
}
((AbstractConditionBean) cb).request().build(builder);
final SearchResponse response = ((AbstractConditionBean) cb).build(builder).execute().actionGet();
final EsPagingResultBean<RESULT> list = new EsPagingResultBean<>();
final SearchHits searchHits = response.getHits();
searchHits.forEach(hit -> {
final Map<String, Object> source = hit.getSource();
final RESULT entity = createEntity(source, entityType);
final DocMeta docMeta = ((AbstractEntity) entity).asDocMeta();
docMeta.id(hit.getId());
docMeta.version(hit.getVersion());
list.add(entity);
});
list.setAllRecordCount((int) searchHits.totalHits());
list.setPageSize(cb.getFetchSize());
list.setCurrentPageNumber(cb.getFetchPageNumber());
list.setTook(response.getTookInMillis());
list.setTotalShards(response.getTotalShards());
list.setSuccessfulShards(response.getSuccessfulShards());
list.setFailedShards(response.getFailedShards());
// TODO others
return list;
}
@Override
protected <RESULT extends ENTITY> void helpSelectCursorHandlingByPaging(CB cb, EntityRowHandler<RESULT> handler,
Class<? extends RESULT> entityType, CursorSelectOption option) {
delegateSelectCursor(cb, handler, entityType);
}
@Override
protected <RESULT extends ENTITY> void delegateSelectCursor(final ConditionBean cb, final EntityRowHandler<RESULT> handler,
final Class<? extends RESULT> entityType) {
delegateBulkRequest(cb, searchHits -> {
searchHits.forEach(hit -> {
if (handler.isBreakCursor()) {
return;
}
final Map<String, Object> source = hit.getSource();
final RESULT entity = createEntity(source, entityType);
final DocMeta docMeta = ((AbstractEntity) entity).asDocMeta();
docMeta.id(hit.getId());
docMeta.version(hit.getVersion());
handler.handle(entity);
});
return !handler.isBreakCursor();
});
}
protected <RESULT extends ENTITY> void delegateSelectBulk(final ConditionBean cb, final EntityRowHandler<List<RESULT>> handler,
final Class<? extends RESULT> entityType) {
assertCBStateValid(cb);
assertObjectNotNull("entityRowHandler", handler);
assertSpecifyDerivedReferrerEntityProperty(cb, entityType);
delegateBulkRequest(cb, searchHits -> {
List<RESULT> list = new ArrayList<>();
searchHits.forEach(hit -> {
final Map<String, Object> source = hit.getSource();
final RESULT entity = createEntity(source, entityType);
final DocMeta docMeta = ((AbstractEntity) entity).asDocMeta();
docMeta.id(hit.getId());
docMeta.version(hit.getVersion());
list.add(entity);
});
handler.handle(list);
return !handler.isBreakCursor();
});
}
protected void delegateBulkRequest(final ConditionBean cb, Function<SearchHits, Boolean> handler) {
final SearchRequestBuilder builder = client.prepareSearch(asEsIndex()).setTypes(asEsIndexType()).setSearchType(SearchType.SCAN)
.setScroll(scrollForCursor).setSize(sizeForCursor);
((AbstractConditionBean) cb).request().build(builder);
final SearchResponse response = ((AbstractConditionBean) cb).build(builder).execute().actionGet();
String scrollId = response.getScrollId();
while (scrollId != null) {
final SearchResponse scrollResponse = client.prepareSearchScroll(scrollId).setScroll(scrollForDelete).execute().actionGet();
scrollId = scrollResponse.getScrollId();
final SearchHits searchHits = scrollResponse.getHits();
final SearchHit[] hits = searchHits.getHits();
if (hits.length == 0) {
scrollId = null;
break;
}
if (!handler.apply(searchHits)) {
break;
}
}
}
@Override
protected Number doReadNextVal() {
final String msg = "This table is NOT related to sequence: " + asEsIndexType();
throw new UnsupportedOperationException(msg);
}
@Override
protected <RESULT extends Entity> ListResultBean<RESULT> createListResultBean(final ConditionBean cb, final List<RESULT> selectedList) {
if (selectedList instanceof EsPagingResultBean) {
return (ListResultBean<RESULT>) selectedList;
}
throw new IllegalBehaviorStateException("selectedList is not EsPagingResultBean.");
}
@Override
protected int delegateInsert(final Entity entity, final InsertOption<? extends ConditionBean> option) {
final AbstractEntity esEntity = (AbstractEntity) entity;
IndexRequestBuilder builder = createInsertRequest(esEntity);
final IndexResponse response = builder.execute().actionGet();
esEntity.asDocMeta().id(response.getId());
return response.isCreated() ? 1 : 0;
}
protected IndexRequestBuilder createInsertRequest(final AbstractEntity esEntity) {
final IndexRequestBuilder builder = client.prepareIndex(asEsIndex(), asEsIndexType()).setSource(esEntity.toSource());
final RequestOptionCall<IndexRequestBuilder> indexOption = esEntity.asDocMeta().indexOption();
if (indexOption != null) {
indexOption.callback(builder);
}
return builder;
}
@Override
protected int delegateUpdate(final Entity entity, final UpdateOption<? extends ConditionBean> option) {
final AbstractEntity esEntity = (AbstractEntity) entity;
final IndexRequestBuilder builder = createUpdateRequest(esEntity);
final IndexResponse response = builder.execute().actionGet();
long version = response.getVersion();
if (version != -1) {
esEntity.asDocMeta().version(version);
}
return 1;
}
protected IndexRequestBuilder createUpdateRequest(final AbstractEntity esEntity) {
final IndexRequestBuilder builder =
client.prepareIndex(asEsIndex(), asEsIndexType(), esEntity.asDocMeta().id()).setSource(esEntity.toSource());
final RequestOptionCall<IndexRequestBuilder> indexOption = esEntity.asDocMeta().indexOption();
if (indexOption != null) {
indexOption.callback(builder);
}
final Long version = esEntity.asDocMeta().version();
if (version != null && version.longValue() != -1) {
builder.setVersion(version);
}
return builder;
}
@Override
protected int delegateDelete(final Entity entity, final DeleteOption<? extends ConditionBean> option) {
final AbstractEntity esEntity = (AbstractEntity) entity;
final DeleteRequestBuilder builder = createDeleteRequest(esEntity);
final DeleteResponse response = builder.execute().actionGet();
return response.isFound() ? 1 : 0;
}
protected DeleteRequestBuilder createDeleteRequest(final AbstractEntity esEntity) {
final DeleteRequestBuilder builder = client.prepareDelete(asEsIndex(), asEsIndexType(), esEntity.asDocMeta().id());
final RequestOptionCall<DeleteRequestBuilder> deleteOption = esEntity.asDocMeta().deleteOption();
if (deleteOption != null) {
deleteOption.callback(builder);
}
return builder;
}
@Override
protected int delegateQueryDelete(final ConditionBean cb, final DeleteOption<? extends ConditionBean> option) {
final SearchRequestBuilder builder = client.prepareSearch(asEsIndex()).setTypes(asEsIndexType()).setSearchType(SearchType.SCAN)
.setScroll(scrollForDelete).setSize(sizeForDelete);
((AbstractConditionBean) cb).request().build(builder);
final SearchResponse response = ((AbstractConditionBean) cb).build(builder).execute().actionGet();
int count = 0;
String scrollId = response.getScrollId();
while (scrollId != null) {
final SearchResponse scrollResponse = client.prepareSearchScroll(scrollId).setScroll(scrollForDelete).execute().actionGet();
scrollId = scrollResponse.getScrollId();
final SearchHits searchHits = scrollResponse.getHits();
final SearchHit[] hits = searchHits.getHits();
if (hits.length == 0) {
scrollId = null;
break;
}
final BulkRequestBuilder bulkRequest = client.prepareBulk();
for (final SearchHit hit : hits) {
bulkRequest.add(client.prepareDelete(asEsIndex(), asEsIndexType(), hit.getId()));
}
count += hits.length;
final BulkResponse bulkResponse = bulkRequest.execute().actionGet();
if (bulkResponse.hasFailures()) {
throw new IllegalBehaviorStateException(bulkResponse.buildFailureMessage());
}
}
return count;
}
protected int[] delegateBatchInsert(final List<? extends Entity> entityList, final InsertOption<? extends ConditionBean> option) {
if (entityList.isEmpty()) {
return new int[] {};
}
return delegateBatchRequest(entityList, esEntity -> {
return createInsertRequest(esEntity);
});
}
protected int[] delegateBatchUpdate(List<? extends Entity> entityList, UpdateOption<? extends ConditionBean> option) {
if (entityList.isEmpty()) {
return new int[] {};
}
return delegateBatchRequest(entityList, esEntity -> {
return createUpdateRequest(esEntity);
});
}
protected int[] delegateBatchDelete(List<? extends Entity> entityList, DeleteOption<? extends ConditionBean> option) {
if (entityList.isEmpty()) {
return new int[] {};
}
return delegateBatchRequest(entityList, esEntity -> {
return createDeleteRequest(esEntity);
});
}
protected <BUILDER> int[] delegateBatchRequest(final List<? extends Entity> entityList, Function<AbstractEntity, BUILDER> call) {
final BulkList<? extends Entity> bulkList = (BulkList<? extends Entity>) entityList;
final BulkRequestBuilder bulkBuilder = client.prepareBulk();
for (final Entity entity : entityList) {
final AbstractEntity esEntity = (AbstractEntity) entity;
BUILDER builder = call.apply(esEntity);
if (builder instanceof IndexRequestBuilder) {
bulkBuilder.add((IndexRequestBuilder) builder);
} else if (builder instanceof UpdateRequestBuilder) {
bulkBuilder.add((UpdateRequestBuilder) builder);
} else if (builder instanceof DeleteRequestBuilder) {
bulkBuilder.add((DeleteRequestBuilder) builder);
}
}
RequestOptionCall<BulkRequestBuilder> builderCall = bulkList.getCall();
if (builderCall != null) {
builderCall.callback(bulkBuilder);
}
BulkResponse response = bulkBuilder.execute().actionGet();
List<Integer> resultList = new ArrayList<>();
for (BulkItemResponse itemResponse : response.getItems()) {
resultList.add(itemResponse.isFailed() ? 0 : 1);
}
int[] results = new int[resultList.size()];
for (int i = 0; i < resultList.size(); i++) {
results[i] = resultList.get(i);
}
return results;
}
protected static String toString(final Object value) {
if (value != null) {
return value.toString();
} else {
return null;
}
}
protected static Short toShort(final Object value) {
if (value instanceof Number) {
return ((Number) value).shortValue();
} else if (value instanceof String) {
return Short.parseShort(value.toString());
} else {
return null;
}
}
protected static Integer toInteger(final Object value) {
if (value instanceof Number) {
return ((Number) value).intValue();
} else if (value instanceof String) {
return Integer.parseInt(value.toString());
} else {
return null;
}
}
protected static Long toLong(final Object value) {
if (value instanceof Number) {
return ((Number) value).longValue();
} else if (value instanceof String) {
return Long.parseLong(value.toString());
} else {
return null;
}
}
protected static Float toFloat(final Object value) {
if (value instanceof Number) {
return ((Number) value).floatValue();
} else if (value instanceof String) {
return Float.parseFloat(value.toString());
} else {
return null;
}
}
protected static Double toDouble(final Object value) {
if (value instanceof Number) {
return ((Number) value).doubleValue();
} else if (value instanceof String) {
return Double.parseDouble(value.toString());
} else {
return null;
}
}
protected static Boolean toBoolean(final Object value) {
if (value instanceof Boolean) {
return ((Boolean) value).booleanValue();
} else if (value instanceof String) {
return Boolean.parseBoolean(value.toString());
} else {
return null;
}
}
public static class BulkList<E> implements List<E> {
private final List<E> parent;
private final RequestOptionCall<BulkRequestBuilder> call;
public BulkList(final List<E> parent, final RequestOptionCall<BulkRequestBuilder> call) {
this.parent = parent;
this.call = call;
}
public int size() {
return parent.size();
}
public boolean isEmpty() {
return parent.isEmpty();
}
public boolean contains(final Object o) {
return parent.contains(o);
}
public Iterator<E> iterator() {
return parent.iterator();
}
public Object[] toArray() {
return parent.toArray();
}
public <T> T[] toArray(final T[] a) {
return parent.toArray(a);
}
public boolean add(final E e) {
return parent.add(e);
}
public boolean remove(final Object o) {
return parent.remove(o);
}
public boolean containsAll(final Collection<?> c) {
return parent.containsAll(c);
}
public boolean addAll(final Collection<? extends E> c) {
return parent.addAll(c);
}
public boolean addAll(final int index, final Collection<? extends E> c) {
return parent.addAll(index, c);
}
public boolean removeAll(final Collection<?> c) {
return parent.removeAll(c);
}
public boolean retainAll(final Collection<?> c) {
return parent.retainAll(c);
}
public void clear() {
parent.clear();
}
public boolean equals(final Object o) {
return parent.equals(o);
}
public int hashCode() {
return parent.hashCode();
}
public E get(final int index) {
return parent.get(index);
}
public E set(final int index, final E element) {
return parent.set(index, element);
}
public void add(final int index, final E element) {
parent.add(index, element);
}
public E remove(final int index) {
return parent.remove(index);
}
public int indexOf(final Object o) {
return parent.indexOf(o);
}
public int lastIndexOf(final Object o) {
return parent.lastIndexOf(o);
}
public ListIterator<E> listIterator() {
return parent.listIterator();
}
public ListIterator<E> listIterator(final int index) {
return parent.listIterator(index);
}
public List<E> subList(final int fromIndex, final int toIndex) {
return parent.subList(fromIndex, toIndex);
}
public RequestOptionCall<BulkRequestBuilder> getCall() {
return call;
}
}
}
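The to〈Type〉 helpers above leniently coerce loosely typed values coming back from Elasticsearch document sources into Java wrapper types, returning null for unsupported inputs. A minimal standalone sketch of the same pattern (the `Coerce` class name is illustrative, not part of the generated client):

```java
// Illustrative sketch of the lenient coercion pattern used by the helpers above.
public class Coerce {
    // Returns an Integer from a Number or a numeric String, or null otherwise.
    public static Integer toInteger(Object value) {
        if (value instanceof Number) {
            return ((Number) value).intValue();
        } else if (value instanceof String) {
            return Integer.parseInt((String) value);
        }
        return null;
    }

    // Returns a Boolean from a Boolean or a String, or null otherwise.
    public static Boolean toBoolean(Object value) {
        if (value instanceof Boolean) {
            return (Boolean) value;
        } else if (value instanceof String) {
            return Boolean.parseBoolean((String) value);
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(toInteger(3L));           // Number path: 3
        System.out.println(toInteger("42"));         // String path: 42
        System.out.println(toBoolean("true"));       // true
        System.out.println(toInteger(new Object())); // unsupported type: null
    }
}
```

Note that, as in the generated helpers, a non-numeric String still raises NumberFormatException; only unsupported types map to null.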


@ -0,0 +1,659 @@
package ${request.package}.cbean.bs;
import ${request.package}.cbean.sqlclause.SqlClauseEs;
import org.dbflute.cbean.ConditionBean;
import org.dbflute.cbean.chelper.HpCBPurpose;
import org.dbflute.cbean.chelper.HpColumnSpHandler;
import org.dbflute.cbean.coption.CursorSelectOption;
import org.dbflute.cbean.coption.ScalarSelectOption;
import org.dbflute.cbean.coption.StatementConfigCall;
import org.dbflute.cbean.dream.SpecifiedColumn;
import org.dbflute.cbean.exception.ConditionBeanExceptionThrower;
import org.dbflute.cbean.ordering.OrderByBean;
import org.dbflute.cbean.paging.PagingBean;
import org.dbflute.cbean.paging.PagingInvoker;
import org.dbflute.cbean.scoping.AndQuery;
import org.dbflute.cbean.scoping.ModeQuery;
import org.dbflute.cbean.scoping.OrQuery;
import org.dbflute.cbean.scoping.UnionQuery;
import org.dbflute.cbean.sqlclause.SqlClause;
import org.dbflute.cbean.sqlclause.orderby.OrderByClause;
import org.dbflute.dbmeta.DBMeta;
import org.dbflute.dbmeta.accessory.DerivedTypeHandler;
import org.dbflute.jdbc.StatementConfig;
import org.dbflute.system.DBFluteSystem;
import org.dbflute.twowaysql.style.BoundDateDisplayStyle;
import org.elasticsearch.action.count.CountRequestBuilder;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.common.unit.TimeValue;
/**
* @author FreeGen
*/
public abstract class AbstractConditionBean implements ConditionBean {
protected int _safetyMaxResultSize;
protected final SqlClause _sqlClause = new SqlClauseEs(asTableDbName());
private SearchRequestParams _searchRequestParams = new SearchRequestParams();
public abstract CountRequestBuilder build(CountRequestBuilder builder);
public abstract SearchRequestBuilder build(SearchRequestBuilder builder);
@Override
public boolean isPaging() {
return false;
}
@Override
public boolean canPagingCountLater() {
return false;
}
@Override
public boolean canPagingReSelect() {
return true;
}
@Override
public void paging(int pageSize, int pageNumber) {
if (pageSize <= 0) {
throwPagingPageSizeNotPlusException(pageSize, pageNumber);
}
fetchFirst(pageSize);
xfetchPage(pageNumber);
}
protected void throwPagingPageSizeNotPlusException(int pageSize, int pageNumber) {
createCBExThrower().throwPagingPageSizeNotPlusException(this, pageSize, pageNumber);
}
protected ConditionBeanExceptionThrower createCBExThrower() {
return new ConditionBeanExceptionThrower();
}
protected void assertObjectNotNull(String variableName, Object value) {
if (variableName == null) {
String msg = "The value should not be null: variableName=null value=" + value;
throw new IllegalArgumentException(msg);
}
if (value == null) {
String msg = "The value should not be null: variableName=" + variableName;
throw new IllegalArgumentException(msg);
}
}
@Override
public void xsetPaging(boolean paging) {
// Do nothing because this is unsupported on ConditionBean.
// And it is possible that this method is called by PagingInvoker.
}
@Override
public void enablePagingCountLater() {
// nothing
}
@Override
public void disablePagingCountLater() {
// nothing
}
@Override
public void enablePagingReSelect() {
// nothing
}
@Override
public void disablePagingReSelect() {
// nothing
}
@Override
public PagingBean fetchFirst(int fetchSize) {
getSqlClause().fetchFirst(fetchSize);
return this;
}
@Override
public PagingBean xfetchScope(int fetchStartIndex, int fetchSize) {
getSqlClause().fetchScope(fetchStartIndex, fetchSize);
return this;
}
@Override
public PagingBean xfetchPage(int fetchPageNumber) {
getSqlClause().fetchPage(fetchPageNumber);
return this;
}
protected String ln() {
return DBFluteSystem.ln();
}
@Override
public <ENTITY> PagingInvoker<ENTITY> createPagingInvoker(String tableDbName) {
// TODO Auto-generated method stub
return null;
}
@Override
public int getFetchStartIndex() {
return getSqlClause().getFetchStartIndex();
}
@Override
public int getFetchSize() {
return getSqlClause().getFetchSize();
}
@Override
public int getFetchPageNumber() {
return getSqlClause().getFetchPageNumber();
}
@Override
public int getPageStartIndex() {
return getSqlClause().getPageStartIndex();
}
@Override
public int getPageEndIndex() {
return getSqlClause().getPageEndIndex();
}
@Override
public boolean isFetchScopeEffective() {
return getSqlClause().isFetchScopeEffective();
}
@Override
public int getFetchNarrowingSkipStartIndex() {
return getPageStartIndex();
}
@Override
public int getFetchNarrowingLoopCount() {
return getFetchSize();
}
@Override
public boolean isFetchNarrowingSkipStartIndexEffective() {
return false;
}
@Override
public boolean isFetchNarrowingLoopCountEffective() {
return false;
}
@Override
public boolean isFetchNarrowingEffective() {
return getSqlClause().isFetchNarrowingEffective();
}
@Override
public void xdisableFetchNarrowing() {
// no need to disable in ConditionBean, basically for OutsideSql
String msg = "This method is unsupported on ConditionBean!";
throw new UnsupportedOperationException(msg);
}
@Override
public void xenableIgnoredFetchNarrowing() {
// do nothing
}
@Override
public void checkSafetyResult(int safetyMaxResultSize) {
_safetyMaxResultSize = safetyMaxResultSize;
}
@Override
public int getSafetyMaxResultSize() {
return _safetyMaxResultSize;
}
@Override
public String getOrderByClause() {
return null;
}
@Override
public OrderByClause getOrderByComponent() {
return null;
}
@Override
public OrderByBean clearOrderBy() {
return null;
}
@Override
public void overTheWaves(SpecifiedColumn dreamCruiseTicket) {
// do nothing
}
@Override
public void mysticRhythms(Object mysticBinding) {
// do nothing
}
@Override
public DBMeta asDBMeta() {
return null;
}
@Override
public SqlClause getSqlClause() {
return _sqlClause;
}
@Override
public ConditionBean addOrderBy_PK_Asc() {
return null;
}
@Override
public ConditionBean addOrderBy_PK_Desc() {
return null;
}
@Override
public HpColumnSpHandler localSp() {
return null;
}
@Override
public void enableInnerJoinAutoDetect() {
// do nothing
}
@Override
public void disableInnerJoinAutoDetect() {
// do nothing
}
@Override
public SpecifiedColumn inviteDerivedToDreamCruise(String derivedAlias) {
return null;
}
@Override
public ConditionBean xcreateDreamCruiseCB() {
return null;
}
@Override
public void xmarkAsDeparturePortForDreamCruise() {
// do nothing
}
@Override
public boolean xisDreamCruiseDeparturePort() {
return false;
}
@Override
public boolean xisDreamCruiseShip() {
return false;
}
@Override
public ConditionBean xgetDreamCruiseDeparturePort() {
return null;
}
@Override
public boolean xhasDreamCruiseTicket() {
return false;
}
@Override
public SpecifiedColumn xshowDreamCruiseTicket() {
return null;
}
@Override
public void xkeepDreamCruiseJourneyLogBook(String relationPath) {
// do nothing
}
@Override
public void xsetupSelectDreamCruiseJourneyLogBook() {
// do nothing
}
@Override
public void xsetupSelectDreamCruiseJourneyLogBookIfUnionExists() {
// do nothing
}
@Override
public Object xgetMysticBinding() {
return null;
}
@Override
public void ignoreNullOrEmptyQuery() {
// TODO
}
@Override
public void checkNullOrEmptyQuery() {
// TODO
}
@Override
public void enableEmptyStringQuery(ModeQuery noArgInLambda) {
// do nothing
}
@Override
public void disableEmptyStringQuery() {
// TODO
}
@Override
public void enableOverridingQuery(ModeQuery noArgInLambda) {
// do nothing
}
@Override
public void disableOverridingQuery() {
// do nothing
}
@Override
public void enablePagingCountLeastJoin() {
// do nothing
}
@Override
public void disablePagingCountLeastJoin() {
// do nothing
}
@Override
public boolean canPagingSelectAndQuerySplit() {
return false;
}
@Override
public ConditionBean lockForUpdate() {
return null;
}
@Override
public ConditionBean xsetupSelectCountIgnoreFetchScope(boolean uniqueCount) {
return null;
}
@Override
public ConditionBean xafterCareSelectCountIgnoreFetchScope() {
return null;
}
@Override
public boolean isSelectCountIgnoreFetchScope() {
return false;
}
@Override
public CursorSelectOption getCursorSelectOption() {
return null;
}
@Override
public void xacceptScalarSelectOption(ScalarSelectOption option) {
// do nothing
}
@Override
public void configure(StatementConfigCall<StatementConfig> confLambda) {
// do nothing
}
@Override
public StatementConfig getStatementConfig() {
return null;
}
@Override
public boolean canRelationMappingCache() {
return false;
}
@Override
public void enableNonSpecifiedColumnAccess() {
// do nothing
}
@Override
public void disableNonSpecifiedColumnAccess() {
// do nothing
}
@Override
public boolean isNonSpecifiedColumnAccessAllowed() {
return false;
}
@Override
public void enableColumnNullObject() {
// TODO
}
@Override
public void disableColumnNullObject() {
// TODO
}
@Override
public void enableQueryUpdateCountPreCheck() {
// do nothing
}
@Override
public void disableQueryUpdateCountPreCheck() {
// do nothing
}
@Override
public boolean isQueryUpdateCountPreCheck() {
return false;
}
@Override
public String toDisplaySql() {
// TODO
return null;
}
@Override
public void styleLogDateDisplay(BoundDateDisplayStyle logDateDisplayStyle) {
// do nothing
}
@Override
public BoundDateDisplayStyle getLogDateDisplayStyle() {
return null;
}
@Override
public boolean hasWhereClauseOnBaseQuery() {
return false;
}
@Override
public void clearWhereClauseOnBaseQuery() {
// do nothing
}
@Override
public boolean hasSelectAllPossible() {
return false;
}
@Override
public boolean hasOrderByClause() {
return false;
}
@Override
public boolean hasUnionQueryOrUnionAllQuery() {
return false;
}
@Override
public void invokeSetupSelect(String foreignPropertyNamePath) {
// do nothing
}
@Override
public SpecifiedColumn invokeSpecifyColumn(String columnPropertyPath) {
return null;
}
@Override
public void invokeOrScopeQuery(OrQuery<ConditionBean> orQuery) {
// do nothing
}
@Override
public void invokeOrScopeQueryAndPart(AndQuery<ConditionBean> andQuery) {
// do nothing
}
@Override
public void xregisterUnionQuerySynchronizer(UnionQuery<ConditionBean> unionQuerySynchronizer) {
// do nothing
}
@Override
public DerivedTypeHandler xgetDerivedTypeHandler() {
return null;
}
@Override
public HpCBPurpose getPurpose() {
return null;
}
@Override
public void xsetupForScalarSelect() {
// do nothing
}
@Override
public void xsetupForQueryInsert() {
// do nothing
}
@Override
public void xsetupForSpecifiedUpdate() {
// do nothing
}
@Override
public void xsetupForVaryingUpdate() {
// do nothing
}
@Override
public void enableThatsBadTiming() {
// do nothing
}
@Override
public void disableThatsBadTiming() {
// do nothing
}
public SearchRequestParams request() {
return _searchRequestParams;
}
public static class SearchRequestParams {
private Boolean explain;
private Float minScore;
private String preference;
private String routing;
private String searchType;
private long timeoutInMillis = -1;
private Boolean version;
private int terminateAfter = 0;
public void build(SearchRequestBuilder builder) {
if (explain != null) {
builder.setExplain(explain);
}
if (minScore != null) {
builder.setMinScore(minScore);
}
if (preference != null) {
builder.setPreference(preference);
}
if (routing != null) {
builder.setRouting(routing);
}
if (searchType != null) {
builder.setSearchType(searchType);
}
if (timeoutInMillis != -1) {
builder.setTimeout(new TimeValue(timeoutInMillis));
}
if (version != null) {
builder.setVersion(version);
}
if (terminateAfter > 0) {
builder.setTerminateAfter(terminateAfter);
}
}
public void setExplain(boolean explain) {
this.explain = explain;
}
public void setMinScore(float minScore) {
this.minScore = minScore;
}
public void setPreference(String preference) {
this.preference = preference;
}
public void setRouting(String routing) {
this.routing = routing;
}
public void setSearchType(String searchType) {
this.searchType = searchType;
}
public void setTimeoutInMillis(long timeoutInMillis) {
this.timeoutInMillis = timeoutInMillis;
}
public void setVersion(boolean version) {
this.version = version;
}
public void setTerminateAfter(int terminateAfter) {
this.terminateAfter = terminateAfter;
}
}
}
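SearchRequestParams applies only explicitly configured options to the request builder, using null (or -1 for the timeout) as an "unset" sentinel so defaults are never overwritten. A self-contained sketch of that guarded-apply pattern (the `Params` class and the string request target are illustrative stand-ins for the Elasticsearch builder):

```java
// Sketch of the sentinel-guarded builder-application pattern.
public class GuardedApply {
    static class Params {
        Boolean explain;           // null means "not configured"
        long timeoutInMillis = -1; // -1 means "not configured"

        // Apply only the options that were explicitly set.
        void build(StringBuilder request) {
            if (explain != null) {
                request.append("explain=").append(explain).append(';');
            }
            if (timeoutInMillis != -1) { // skip when left at the sentinel
                request.append("timeout=").append(timeoutInMillis).append("ms;");
            }
        }
    }

    public static void main(String[] args) {
        Params p = new Params();
        p.explain = true;          // timeout deliberately left unset
        StringBuilder req = new StringBuilder();
        p.build(req);
        System.out.println(req);   // explain=true;
    }
}
```

The sentinel approach avoids boxing the timeout, at the cost of reserving -1; the wrapper-type fields use null for the same purpose.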


@ -0,0 +1,216 @@
package ${request.package}.cbean.cf.bs;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import org.dbflute.cbean.ckey.ConditionKey;
import org.elasticsearch.index.query.AndFilterBuilder;
import org.elasticsearch.index.query.BoolFilterBuilder;
import org.elasticsearch.index.query.ExistsFilterBuilder;
import org.elasticsearch.index.query.FilterBuilder;
import org.elasticsearch.index.query.FilterBuilders;
import org.elasticsearch.index.query.IdsFilterBuilder;
import org.elasticsearch.index.query.MatchAllFilterBuilder;
import org.elasticsearch.index.query.MissingFilterBuilder;
import org.elasticsearch.index.query.NotFilterBuilder;
import org.elasticsearch.index.query.OrFilterBuilder;
import org.elasticsearch.index.query.PrefixFilterBuilder;
import org.elasticsearch.index.query.QueryBuilder;
import org.elasticsearch.index.query.QueryFilterBuilder;
import org.elasticsearch.index.query.RangeFilterBuilder;
import org.elasticsearch.index.query.ScriptFilterBuilder;
import org.elasticsearch.index.query.TermFilterBuilder;
import org.elasticsearch.index.query.TermsFilterBuilder;
public class AbstractConditionFilter {
protected List<FilterBuilder> filterBuilderList;
public boolean hasFilters() {
return filterBuilderList != null && !filterBuilderList.isEmpty();
}
public FilterBuilder getFilter() {
if (filterBuilderList == null) {
return null;
} else if (filterBuilderList.size() == 1) {
return filterBuilderList.get(0);
}
return FilterBuilders.andFilter(filterBuilderList.toArray(new FilterBuilder[filterBuilderList.size()]));
}
public void addFilter(FilterBuilder filterBuilder) {
regF(filterBuilder);
}
public void setIds_Equal(Collection<String> idList) {
setIds_Equal(idList, null);
}
public void setIds_Equal(Collection<String> idList, ConditionOptionCall<IdsFilterBuilder> opLambda) {
IdsFilterBuilder builder = regIdsF(idList);
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void matchAll() {
matchAll(null);
}
public void matchAll(ConditionOptionCall<MatchAllFilterBuilder> opLambda) {
MatchAllFilterBuilder builder = FilterBuilders.matchAllFilter();
regF(builder);
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void setScript(String script) {
setScript(script, null);
}
public void setScript(String script, ConditionOptionCall<ScriptFilterBuilder> opLambda) {
ScriptFilterBuilder builder = regScriptF(script);
if (opLambda != null) {
opLambda.callback(builder);
}
}
protected void regF(FilterBuilder builder) {
if (filterBuilderList == null) {
filterBuilderList = new ArrayList<>();
}
filterBuilderList.add(builder);
}
protected TermFilterBuilder regTermF(String name, Object value) {
TermFilterBuilder termFilter = FilterBuilders.termFilter(name, value);
regF(termFilter);
return termFilter;
}
protected TermsFilterBuilder regTermsF(String name, Collection<?> value) {
TermsFilterBuilder termsFilter = FilterBuilders.termsFilter(name, value);
regF(termsFilter);
return termsFilter;
}
protected PrefixFilterBuilder regPrefixF(String name, String value) {
PrefixFilterBuilder prefixFilter = FilterBuilders.prefixFilter(name, value);
regF(prefixFilter);
return prefixFilter;
}
protected ExistsFilterBuilder regExistsF(String name) {
ExistsFilterBuilder existsFilter = FilterBuilders.existsFilter(name);
regF(existsFilter);
return existsFilter;
}
protected MissingFilterBuilder regMissingF(String name) {
MissingFilterBuilder missingFilter = FilterBuilders.missingFilter(name);
regF(missingFilter);
return missingFilter;
}
protected RangeFilterBuilder regRangeF(String name, ConditionKey ck, Object value) {
if (filterBuilderList != null) { // the list is lazily created in regF()
for (FilterBuilder builder : filterBuilderList) {
if (builder instanceof RangeFilterBuilder) {
RangeFilterBuilder rangeFilterBuilder = (RangeFilterBuilder) builder;
if (rangeFilterBuilder.toString().replaceAll("\\s", "").startsWith("{\"range\":{\"" + name + "\"")) {
addRangeC(rangeFilterBuilder, ck, value);
return rangeFilterBuilder;
}
}
}
}
RangeFilterBuilder rangeFilterBuilder = FilterBuilders.rangeFilter(name);
addRangeC(rangeFilterBuilder, ck, value);
regF(rangeFilterBuilder);
return rangeFilterBuilder;
}
protected void addRangeC(RangeFilterBuilder builder, ConditionKey ck, Object value) {
if (ck.equals(ConditionKey.CK_GREATER_THAN)) {
builder.gt(value);
} else if (ck.equals(ConditionKey.CK_GREATER_EQUAL)) {
builder.gte(value);
} else if (ck.equals(ConditionKey.CK_LESS_THAN)) {
builder.lt(value);
} else if (ck.equals(ConditionKey.CK_LESS_EQUAL)) {
builder.lte(value);
}
}
protected ScriptFilterBuilder regScriptF(String script) {
ScriptFilterBuilder scriptFilter = FilterBuilders.scriptFilter(script);
regF(scriptFilter);
return scriptFilter;
}
protected IdsFilterBuilder regIdsF(Collection<?> value) {
IdsFilterBuilder idsFilter = FilterBuilders.idsFilter(value.toArray(new String[value.size()]));
regF(idsFilter);
return idsFilter;
}
protected BoolFilterBuilder regBoolF(List<FilterBuilder> mustList, List<FilterBuilder> shouldList, List<FilterBuilder> mustNotList) {
BoolFilterBuilder boolFilter = FilterBuilders.boolFilter();
mustList.forEach(query -> {
boolFilter.must(query);
});
shouldList.forEach(query -> {
boolFilter.should(query);
});
mustNotList.forEach(query -> {
boolFilter.mustNot(query);
});
return boolFilter;
}
protected AndFilterBuilder regAndF(List<FilterBuilder> filterList) {
AndFilterBuilder andFilter = FilterBuilders.andFilter(filterList.toArray(new FilterBuilder[filterList.size()]));
regF(andFilter);
return andFilter;
}
protected OrFilterBuilder regOrF(List<FilterBuilder> filterList) {
OrFilterBuilder orFilter = FilterBuilders.orFilter(filterList.toArray(new FilterBuilder[filterList.size()]));
regF(orFilter);
return orFilter;
}
protected NotFilterBuilder regNotF(FilterBuilder filter) {
NotFilterBuilder notFilter = FilterBuilders.notFilter(filter);
regF(notFilter);
return notFilter;
}
protected QueryFilterBuilder regQueryF(QueryBuilder filter) {
QueryFilterBuilder queryFilter = FilterBuilders.queryFilter(filter);
regF(queryFilter);
return queryFilter;
}
@FunctionalInterface
public interface ConditionOptionCall<OP extends FilterBuilder> {
/**
* @param op The option of condition to be set up. (NotNull)
*/
void callback(OP op);
}
@FunctionalInterface
public interface BoolCall<CF extends AbstractConditionFilter> {
void callback(CF must, CF should, CF mustNot);
}
@FunctionalInterface
public interface OperatorCall<CF extends AbstractConditionFilter> {
void callback(CF and);
}
}


@ -0,0 +1,385 @@
package ${request.package}.cbean.cq.bs;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import ${request.package}.cbean.cf.bs.AbstractConditionFilter;
import org.dbflute.cbean.ConditionBean;
import org.dbflute.cbean.ConditionQuery;
import org.dbflute.cbean.ckey.ConditionKey;
import org.dbflute.cbean.coption.ConditionOption;
import org.dbflute.cbean.coption.ParameterOption;
import org.dbflute.cbean.cvalue.ConditionValue;
import org.dbflute.cbean.sqlclause.SqlClause;
import org.dbflute.dbmeta.info.ColumnInfo;
import org.dbflute.dbmeta.name.ColumnRealName;
import org.dbflute.dbmeta.name.ColumnSqlName;
import org.dbflute.util.Srl;
import org.elasticsearch.index.query.BoolQueryBuilder;
import org.elasticsearch.index.query.FilterBuilder;
import org.elasticsearch.index.query.FilteredQueryBuilder;
import org.elasticsearch.index.query.FuzzyQueryBuilder;
import org.elasticsearch.index.query.MatchAllQueryBuilder;
import org.elasticsearch.index.query.MatchQueryBuilder;
import org.elasticsearch.index.query.PrefixQueryBuilder;
import org.elasticsearch.index.query.QueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.index.query.QueryStringQueryBuilder;
import org.elasticsearch.index.query.RangeQueryBuilder;
import org.elasticsearch.index.query.TermQueryBuilder;
import org.elasticsearch.index.query.TermsQueryBuilder;
import org.elasticsearch.search.sort.FieldSortBuilder;
import org.elasticsearch.search.sort.SortBuilders;
import org.elasticsearch.search.sort.SortOrder;
/**
* @author FreeGen
*/
public abstract class AbstractConditionQuery implements ConditionQuery {
protected static final String CQ_PROPERTY = "conditionQuery";
protected List<QueryBuilder> queryBuilderList;
protected List<FieldSortBuilder> fieldSortBuilderList;
private DocMetaCQ docMetaCQ;
public DocMetaCQ docMeta() {
if (docMetaCQ == null) {
docMetaCQ = new DocMetaCQ();
}
return docMetaCQ;
}
public List<FieldSortBuilder> getFieldSortBuilderList() {
return fieldSortBuilderList == null ? Collections.emptyList() : fieldSortBuilderList;
}
public boolean hasQueries() {
return queryBuilderList != null && !queryBuilderList.isEmpty();
}
public QueryBuilder getQuery() {
if (queryBuilderList == null) {
return null;
} else if (queryBuilderList.size() == 1) {
return queryBuilderList.get(0);
}
BoolQueryBuilder boolQuery = QueryBuilders.boolQuery();
queryBuilderList.forEach(query -> {
boolQuery.must(query);
});
return boolQuery;
}
public void addQuery(QueryBuilder queryBuilder) {
regQ(queryBuilder);
}
public void queryString(String queryString) {
queryString(queryString, null);
}
public void queryString(String queryString, ConditionOptionCall<QueryStringQueryBuilder> opLambda) {
QueryStringQueryBuilder queryStringQuery = QueryBuilders.queryStringQuery(queryString);
regQ(queryStringQuery);
if (opLambda != null) {
opLambda.callback(queryStringQuery);
}
}
public void matchAll() {
matchAll(null);
}
public void matchAll(ConditionOptionCall<MatchAllQueryBuilder> opLambda) {
MatchAllQueryBuilder builder = QueryBuilders.matchAllQuery();
regQ(builder);
if (opLambda != null) {
opLambda.callback(builder);
}
}
protected FilteredQueryBuilder regFilteredQ(QueryBuilder queryBuilder, FilterBuilder filterBuilder) {
return QueryBuilders.filteredQuery(queryBuilder, filterBuilder);
}
protected BoolQueryBuilder regBoolCQ(List<QueryBuilder> mustList, List<QueryBuilder> shouldList, List<QueryBuilder> mustNotList) {
BoolQueryBuilder boolQuery = QueryBuilders.boolQuery();
mustList.forEach(query -> {
boolQuery.must(query);
});
shouldList.forEach(query -> {
boolQuery.should(query);
});
mustNotList.forEach(query -> {
boolQuery.mustNot(query);
});
return boolQuery;
}
protected TermQueryBuilder regTermQ(String name, Object value) {
TermQueryBuilder termQuery = QueryBuilders.termQuery(name, value);
regQ(termQuery);
return termQuery;
}
protected TermsQueryBuilder regTermsQ(String name, Collection<?> value) {
TermsQueryBuilder termsQuery = QueryBuilders.termsQuery(name, value);
regQ(termsQuery);
return termsQuery;
}
protected MatchQueryBuilder regMatchQ(String name, Object value) {
MatchQueryBuilder matchQuery = QueryBuilders.matchQuery(name, value);
regQ(matchQuery);
return matchQuery;
}
protected MatchQueryBuilder regMatchPhraseQ(String name, Object value) {
MatchQueryBuilder matchQuery = QueryBuilders.matchPhraseQuery(name, value);
regQ(matchQuery);
return matchQuery;
}
protected MatchQueryBuilder regMatchPhrasePrefixQ(String name, Object value) {
MatchQueryBuilder matchQuery = QueryBuilders.matchPhrasePrefixQuery(name, value);
regQ(matchQuery);
return matchQuery;
}
protected FuzzyQueryBuilder regFuzzyQ(String name, Object value) {
FuzzyQueryBuilder fuzzyQuery = QueryBuilders.fuzzyQuery(name, value);
regQ(fuzzyQuery);
return fuzzyQuery;
}
protected PrefixQueryBuilder regPrefixQ(String name, String prefix) {
PrefixQueryBuilder prefixQuery = QueryBuilders.prefixQuery(name, prefix);
regQ(prefixQuery);
return prefixQuery;
}
protected RangeQueryBuilder regRangeQ(String name, ConditionKey ck, Object value) {
if (queryBuilderList != null) { // the list is lazily created in regQ()
for (QueryBuilder builder : queryBuilderList) {
if (builder instanceof RangeQueryBuilder) {
RangeQueryBuilder rangeQueryBuilder = (RangeQueryBuilder) builder;
if (rangeQueryBuilder.toString().replaceAll("\\s", "").startsWith("{\"range\":{\"" + name + "\"")) {
addRangeC(rangeQueryBuilder, ck, value);
return rangeQueryBuilder;
}
}
}
}
RangeQueryBuilder rangeQueryBuilder = QueryBuilders.rangeQuery(name);
addRangeC(rangeQueryBuilder, ck, value);
regQ(rangeQueryBuilder);
return rangeQueryBuilder;
}
protected void addRangeC(RangeQueryBuilder builder, ConditionKey ck, Object value) {
if (ck.equals(ConditionKey.CK_GREATER_THAN)) {
builder.gt(value);
} else if (ck.equals(ConditionKey.CK_GREATER_EQUAL)) {
builder.gte(value);
} else if (ck.equals(ConditionKey.CK_LESS_THAN)) {
builder.lt(value);
} else if (ck.equals(ConditionKey.CK_LESS_EQUAL)) {
builder.lte(value);
}
}
protected void regQ(QueryBuilder builder) {
if (queryBuilderList == null) {
queryBuilderList = new ArrayList<>();
}
queryBuilderList.add(builder);
}
protected void regOBA(String field) {
registerOrderBy(field, true);
}
protected void regOBD(String field) {
registerOrderBy(field, false);
}
protected void registerOrderBy(String field, boolean ascOrDesc) {
if (fieldSortBuilderList == null) {
fieldSortBuilderList = new ArrayList<>();
}
fieldSortBuilderList.add(SortBuilders.fieldSort(field).order(ascOrDesc ? SortOrder.ASC : SortOrder.DESC));
}
@Override
public ColumnRealName toColumnRealName(String columnDbName) {
return ColumnRealName.create(xgetAliasName(), toColumnSqlName(columnDbName));
}
@Override
public ColumnRealName toColumnRealName(ColumnInfo columnInfo) {
return ColumnRealName.create(xgetAliasName(), columnInfo.getColumnSqlName());
}
@Override
public ColumnSqlName toColumnSqlName(String columnDbName) {
return new ColumnSqlName(columnDbName);
}
@Override
public ConditionBean xgetBaseCB() {
return null;
}
@Override
public ConditionQuery xgetBaseQuery() {
return null;
}
@Override
public ConditionQuery xgetReferrerQuery() {
return null;
}
@Override
public SqlClause xgetSqlClause() {
return null;
}
@Override
public int xgetNestLevel() {
return 0;
}
@Override
public int xgetNextNestLevel() {
return 0;
}
@Override
public boolean isBaseQuery() {
return false;
}
@Override
public String xgetForeignPropertyName() {
// TODO
return null;
}
@Override
public String xgetRelationPath() {
// TODO
return null;
}
@Override
public String xgetLocationBase() {
final StringBuilder sb = new StringBuilder();
ConditionQuery query = this;
while (true) {
if (query.isBaseQuery()) {
sb.insert(0, CQ_PROPERTY + ".");
break;
} else {
final String foreignPropertyName = query.xgetForeignPropertyName();
if (foreignPropertyName == null) {
String msg = "The foreignPropertyName of the query should not be null:";
msg = msg + " query=" + query;
throw new IllegalStateException(msg);
}
sb.insert(0, CQ_PROPERTY + initCap(foreignPropertyName) + ".");
}
query = query.xgetReferrerQuery();
}
return sb.toString();
}
protected String initCap(String str) {
return Srl.initCap(str);
}
@Override
public ConditionValue invokeValue(String columnFlexibleName) {
return null;
}
@Override
public void invokeQuery(String columnFlexibleName, String conditionKeyName, Object conditionValue) {
// nothing
}
@Override
public void invokeQuery(String columnFlexibleName, String conditionKeyName, Object conditionValue, ConditionOption conditionOption) {
// nothing
}
@Override
public void invokeQueryEqual(String columnFlexibleName, Object conditionValue) {
// nothing
}
@Override
public void invokeQueryNotEqual(String columnFlexibleName, Object conditionValue) {
// nothing
}
@Override
public void invokeOrderBy(String columnFlexibleName, boolean isAsc) {
// nothing
}
@Override
public ConditionQuery invokeForeignCQ(String foreignPropertyName) {
// TODO
return null;
}
@Override
public boolean invokeHasForeignCQ(String foreignPropertyName) {
// TODO
return false;
}
@Override
public void xregisterParameterOption(ParameterOption option) {
// nothing
}
public class DocMetaCQ {
public void setId_Equal(String id) {
regQ(QueryBuilders.idsQuery(asTableDbName()).addIds(id));
}
}
@FunctionalInterface
public interface ConditionOptionCall<OP extends QueryBuilder> {
/**
* @param op The option of condition to be set up. (NotNull)
*/
void callback(OP op);
}
@FunctionalInterface
public interface BoolCall<CQ extends AbstractConditionQuery> {
void callback(CQ must, CQ should, CQ mustNot);
}
@FunctionalInterface
public interface FilteredCall<CQ extends AbstractConditionQuery, CF extends AbstractConditionFilter> {
void callback(CQ query, CF filter);
}
@FunctionalInterface
public interface OperatorCall<CQ extends AbstractConditionQuery> {
void callback(CQ and);
}
}
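regRangeQ and regRangeF above merge successive bounds on the same field into a single range condition by scanning the already-registered builders before creating a new one. A dependency-free sketch of that merge strategy (the `RangeMerge` name and map representation are illustrative; the real code works on Elasticsearch RangeQueryBuilder/RangeFilterBuilder instances):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: collapse repeated bounds on one field into a single range entry.
public class RangeMerge {
    // field -> (operator -> bound); one entry per field, so gt and lte on the
    // same field end up in the same range condition instead of two conditions.
    private final Map<String, Map<String, Object>> ranges = new LinkedHashMap<>();

    public void addRange(String field, String op, Object value) {
        ranges.computeIfAbsent(field, k -> new LinkedHashMap<>()).put(op, value);
    }

    public Map<String, Map<String, Object>> getRanges() {
        return ranges;
    }

    public static void main(String[] args) {
        RangeMerge q = new RangeMerge();
        q.addRange("age", "gte", 20);
        q.addRange("age", "lt", 30);    // merged into the existing "age" range
        q.addRange("score", "gt", 0.5); // a new range on another field
        System.out.println(q.getRanges());
    }
}
```

The generated code cannot key on the field directly, so it detects an existing range by prefix-matching the builder's JSON rendering; the map here makes the same intent explicit.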


@ -0,0 +1,221 @@
package ${request.package}.bsentity;
import java.io.Serializable;
import java.util.Map;
import java.util.Set;
import org.dbflute.Entity;
import org.dbflute.FunCustodial;
import org.dbflute.dbmeta.accessory.EntityModifiedProperties;
import org.dbflute.dbmeta.accessory.EntityUniqueDrivenProperties;
import org.dbflute.util.DfCollectionUtil;
import org.elasticsearch.action.delete.DeleteRequestBuilder;
import org.elasticsearch.action.index.IndexRequestBuilder;
/**
* @author FreeGen
*/
public abstract class AbstractEntity implements Entity, Serializable, Cloneable {
private static final long serialVersionUID = 1L;
protected DocMeta docMeta;
protected final EntityUniqueDrivenProperties __uniqueDrivenProperties = newUniqueDrivenProperties();
protected final EntityModifiedProperties __modifiedProperties = newModifiedProperties();
protected EntityModifiedProperties __specifiedProperties;
public DocMeta asDocMeta() {
if (docMeta == null) {
docMeta = new DocMeta();
}
return docMeta;
}
public Set<String> mymodifiedProperties() {
return __modifiedProperties.getPropertyNames();
}
public void mymodifyProperty(String propertyName) {
registerModifiedProperty(propertyName);
}
public void mymodifyPropertyCancel(String propertyName) {
__modifiedProperties.remove(propertyName);
}
public void clearModifiedInfo() {
__modifiedProperties.clear();
}
public boolean hasModification() {
return !__modifiedProperties.isEmpty();
}
protected EntityModifiedProperties newModifiedProperties() {
return new EntityModifiedProperties();
}
protected void registerModifiedProperty(String propertyName) {
__modifiedProperties.addPropertyName(propertyName);
registerSpecifiedProperty(propertyName); // synchronize if exists, basically for user's manual call
}
public void modifiedToSpecified() {
if (__modifiedProperties.isEmpty()) {
return; // basically no way when called in Framework (because called when SpecifyColumn exists)
}
__specifiedProperties = newModifiedProperties();
__specifiedProperties.accept(__modifiedProperties);
}
public Set<String> myspecifiedProperties() {
if (__specifiedProperties != null) {
return __specifiedProperties.getPropertyNames();
}
return DfCollectionUtil.emptySet();
}
public void myspecifyProperty(String propertyName) {
registerSpecifiedProperty(propertyName);
}
public void myspecifyPropertyCancel(String propertyName) {
if (__specifiedProperties != null) {
__specifiedProperties.remove(propertyName);
}
}
public void clearSpecifiedInfo() {
if (__specifiedProperties != null) {
__specifiedProperties.clear();
}
}
protected void checkSpecifiedProperty(String propertyName) {
FunCustodial.checkSpecifiedProperty(this, propertyName, __specifiedProperties);
}
protected void registerSpecifiedProperty(String propertyName) { // basically called by modified property registration
if (__specifiedProperties != null) { // normally false, true if e.g. setting after selected
__specifiedProperties.addPropertyName(propertyName);
}
}
@Override
public boolean hasPrimaryKeyValue() {
return asDocMeta().id() != null;
}
protected EntityUniqueDrivenProperties newUniqueDrivenProperties() {
return new EntityUniqueDrivenProperties();
}
@Override
public Set<String> myuniqueDrivenProperties() {
return __uniqueDrivenProperties.getPropertyNames();
}
@Override
public void myuniqueByProperty(String propertyName) {
__uniqueDrivenProperties.addPropertyName(propertyName);
}
@Override
public void myuniqueByPropertyCancel(String propertyName) {
__uniqueDrivenProperties.remove(propertyName);
}
@Override
public void clearUniqueDrivenInfo() {
__uniqueDrivenProperties.clear();
}
@Override
public void markAsSelect() {
// TODO Auto-generated method stub
}
@Override
public boolean createdBySelect() {
// TODO Auto-generated method stub
return false;
}
@Override
public int instanceHash() {
// TODO Auto-generated method stub
return 0;
}
@Override
public String toStringWithRelation() {
// TODO Auto-generated method stub
return null;
}
@Override
public String buildDisplayString(String name, boolean column, boolean relation) {
// TODO Auto-generated method stub
return null;
}
public abstract Map<String, Object> toSource();
public class DocMeta {
protected String id;
protected Long version;
private RequestOptionCall<IndexRequestBuilder> indexOption;
private RequestOptionCall<DeleteRequestBuilder> deleteOption;
public DocMeta id(String id) {
this.id = id;
myuniqueByProperty("_id");
return this;
}
public String id() {
return id;
}
public DocMeta version(Long version) {
this.version = version;
return this;
}
public Long version() {
return version;
}
public DocMeta indexOption(RequestOptionCall<IndexRequestBuilder> builder) {
this.indexOption = builder;
return this;
}
public RequestOptionCall<IndexRequestBuilder> indexOption() {
return indexOption;
}
public DocMeta deleteOption(RequestOptionCall<DeleteRequestBuilder> builder) {
this.deleteOption = builder;
return this;
}
public RequestOptionCall<DeleteRequestBuilder> deleteOption() {
return deleteOption;
}
}
@FunctionalInterface
public interface RequestOptionCall<OP> {
void callback(OP op);
}
}
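The modified/specified bookkeeping in the template above is the dirty-checking core: every generated setter registers its property name before assigning, so later updates can send only the touched fields. The class below is a stripped-down, hypothetical stand-in for that pattern, not the DBFlute `EntityModifiedProperties` accessory:

```java
import java.util.LinkedHashSet;
import java.util.Set;

public class DirtyCheckSketch {
    // Stand-in for EntityModifiedProperties: names of properties touched by setters.
    private final Set<String> modified = new LinkedHashSet<>();
    private String memberName;

    public void setMemberName(String value) {
        modified.add("memberName"); // register before assigning, as in the template
        this.memberName = value;
    }

    public boolean hasModification() {
        return !modified.isEmpty();
    }

    public Set<String> modifiedProperties() {
        return modified;
    }

    public static void main(String[] args) {
        DirtyCheckSketch e = new DirtyCheckSketch();
        System.out.println(e.hasModification()); // false
        e.setMemberName("taro");
        System.out.println(e.modifiedProperties()); // [memberName]
    }
}
```

The same register-then-assign ordering is what lets `modifiedToSpecified()` reuse the modified set as the specified-column set.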


@@ -0,0 +1,228 @@
package ${request.package}.bsbhv;
import java.util.List;
import java.util.Map;
import ${request.package}.bsentity.AbstractEntity;
import ${request.package}.bsentity.AbstractEntity.RequestOptionCall;
import ${request.package}.bsentity.dbmeta.${table.camelizedName}Dbm;
import ${request.package}.cbean.${table.camelizedName}CB;
import ${request.package}.exentity.${table.camelizedName};
import org.dbflute.Entity;
import org.dbflute.bhv.readable.CBCall;
import org.dbflute.bhv.readable.EntityRowHandler;
import org.dbflute.cbean.ConditionBean;
import org.dbflute.cbean.result.ListResultBean;
import org.dbflute.cbean.result.PagingResultBean;
import org.dbflute.exception.IllegalBehaviorStateException;
import org.dbflute.optional.OptionalEntity;
import org.elasticsearch.action.bulk.BulkRequestBuilder;
import org.elasticsearch.action.delete.DeleteRequestBuilder;
import org.elasticsearch.action.index.IndexRequestBuilder;
/**
* @author FreeGen
*/
public abstract class Bs${table.camelizedName}Bhv extends AbstractBehavior<${table.camelizedName}, ${table.camelizedName}CB> {
@Override
public String asTableDbName() {
return asEsIndexType();
}
@Override
protected String asEsIndex() {
return "${table.indexSettings.index}";
}
@Override
public String asEsIndexType() {
return "${table.name}";
}
@Override
public String asEsSearchType() {
return "${table.name}";
}
@Override
public ${table.camelizedName}Dbm asDBMeta() {
return ${table.camelizedName}Dbm.getInstance();
}
@Override
protected <RESULT extends ${table.camelizedName}> RESULT createEntity(Map<String, Object> source, Class<? extends RESULT> entityType) {
try {
final RESULT result = entityType.newInstance();
#foreach ($column in $table.columnList)
#if ($column.isNormalColumn)
#set ($javaNative = ${column.type})
result.set${column.capCamelName}(to$javaNative(source.get("${column.name}")));
#end
#end
return result;
} catch (InstantiationException | IllegalAccessException e) {
final String msg = "Cannot create a new instance: " + entityType.getName();
throw new IllegalBehaviorStateException(msg, e);
}
}
public int selectCount(CBCall<${table.camelizedName}CB> cbLambda) {
return facadeSelectCount(createCB(cbLambda));
}
public OptionalEntity<${table.camelizedName}> selectEntity(CBCall<${table.camelizedName}CB> cbLambda) {
return facadeSelectEntity(createCB(cbLambda));
}
protected OptionalEntity<${table.camelizedName}> facadeSelectEntity(${table.camelizedName}CB cb) {
return doSelectOptionalEntity(cb, typeOfSelectedEntity());
}
protected <ENTITY extends ${table.camelizedName}> OptionalEntity<ENTITY> doSelectOptionalEntity(${table.camelizedName}CB cb,
Class<? extends ENTITY> tp) {
return createOptionalEntity(doSelectEntity(cb, tp), cb);
}
@Override
public ${table.camelizedName}CB newConditionBean() {
return new ${table.camelizedName}CB();
}
@Override
protected Entity doReadEntity(ConditionBean cb) {
return facadeSelectEntity(downcast(cb)).orElse(null);
}
public ${table.camelizedName} selectEntityWithDeletedCheck(CBCall<${table.camelizedName}CB> cbLambda) {
return facadeSelectEntityWithDeletedCheck(createCB(cbLambda));
}
public OptionalEntity<${table.camelizedName}> selectByPK(String id) {
return facadeSelectByPK(id);
}
protected OptionalEntity<${table.camelizedName}> facadeSelectByPK(String id) {
return doSelectOptionalByPK(id, typeOfSelectedEntity());
}
protected <ENTITY extends ${table.camelizedName}> ENTITY doSelectByPK(String id, Class<? extends ENTITY> tp) {
return doSelectEntity(xprepareCBAsPK(id), tp);
}
protected ${table.camelizedName}CB xprepareCBAsPK(String id) {
assertObjectNotNull("id", id);
return newConditionBean().acceptPK(id);
}
protected <ENTITY extends ${table.camelizedName}> OptionalEntity<ENTITY> doSelectOptionalByPK(String id, Class<? extends ENTITY> tp) {
return createOptionalEntity(doSelectByPK(id, tp), id);
}
@Override
protected Class<? extends ${table.camelizedName}> typeOfSelectedEntity() {
return ${table.camelizedName}.class;
}
@Override
protected Class<${table.camelizedName}> typeOfHandlingEntity() {
return ${table.camelizedName}.class;
}
@Override
protected Class<${table.camelizedName}CB> typeOfHandlingConditionBean() {
return ${table.camelizedName}CB.class;
}
public ListResultBean<${table.camelizedName}> selectList(CBCall<${table.camelizedName}CB> cbLambda) {
return facadeSelectList(createCB(cbLambda));
}
public PagingResultBean<${table.camelizedName}> selectPage(CBCall<${table.camelizedName}CB> cbLambda) {
// TODO same?
return (PagingResultBean<${table.camelizedName}>) facadeSelectList(createCB(cbLambda));
}
public void selectCursor(CBCall<${table.camelizedName}CB> cbLambda, EntityRowHandler<${table.camelizedName}> entityLambda) {
facadeSelectCursor(createCB(cbLambda), entityLambda);
}
public void selectBulk(CBCall<${table.camelizedName}CB> cbLambda, EntityRowHandler<List<${table.camelizedName}>> entityLambda) {
delegateSelectBulk(createCB(cbLambda), entityLambda, typeOfSelectedEntity());
}
public void insert(${table.camelizedName} entity) {
doInsert(entity, null);
}
public void insert(${table.camelizedName} entity, RequestOptionCall<IndexRequestBuilder> opLambda) {
if (entity instanceof AbstractEntity) {
entity.asDocMeta().indexOption(opLambda);
}
doInsert(entity, null);
}
public void update(${table.camelizedName} entity) {
doUpdate(entity, null);
}
public void update(${table.camelizedName} entity, RequestOptionCall<IndexRequestBuilder> opLambda) {
if (entity instanceof AbstractEntity) {
entity.asDocMeta().indexOption(opLambda);
}
doUpdate(entity, null);
}
public void insertOrUpdate(${table.camelizedName} entity) {
doInsertOrUpdate(entity, null, null);
}
public void insertOrUpdate(${table.camelizedName} entity, RequestOptionCall<IndexRequestBuilder> opLambda) {
if (entity instanceof AbstractEntity) {
entity.asDocMeta().indexOption(opLambda);
}
doInsertOrUpdate(entity, null, null);
}
public void delete(${table.camelizedName} entity) {
doDelete(entity, null);
}
public void delete(${table.camelizedName} entity, RequestOptionCall<DeleteRequestBuilder> opLambda) {
if (entity instanceof AbstractEntity) {
entity.asDocMeta().deleteOption(opLambda);
}
doDelete(entity, null);
}
public int queryDelete(CBCall<${table.camelizedName}CB> cbLambda) {
return doQueryDelete(createCB(cbLambda), null);
}
public int[] batchInsert(List<${table.camelizedName}> list) {
return batchInsert(list, null);
}
public int[] batchInsert(List<${table.camelizedName}> list, RequestOptionCall<BulkRequestBuilder> call) {
return doBatchInsert(new BulkList<>(list, call), null);
}
public int[] batchUpdate(List<${table.camelizedName}> list) {
return batchUpdate(list, null);
}
public int[] batchUpdate(List<${table.camelizedName}> list, RequestOptionCall<BulkRequestBuilder> call) {
return doBatchUpdate(new BulkList<>(list, call), null);
}
public int[] batchDelete(List<${table.camelizedName}> list) {
return batchDelete(list, null);
}
public int[] batchDelete(List<${table.camelizedName}> list, RequestOptionCall<BulkRequestBuilder> call) {
return doBatchDelete(new BulkList<>(list, call), null);
}
// TODO create, modify, remove
}
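All the `select*` entry points above share one calling convention: the caller passes a `CBCall` lambda, the behavior creates a fresh condition-bean, hands it to the lambda for mutation, then executes it. A minimal stand-alone sketch of that convention, with hypothetical names in place of the DBFlute runtime:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Stand-in condition-bean: just collects field=value terms.
class SketchCB {
    final List<String> terms = new ArrayList<>();
    void setId_Equal(String id) { terms.add("_id=" + id); }
}

public class CBCallSketch {
    // Mirrors createCB(cbLambda): instantiate, let the lambda mutate, return.
    static SketchCB createCB(Consumer<SketchCB> cbLambda) {
        SketchCB cb = new SketchCB();
        cbLambda.accept(cb);
        return cb;
    }

    public static void main(String[] args) {
        SketchCB cb = createCB(c -> c.setId_Equal("42"));
        System.out.println(cb.terms); // [_id=42]
    }
}
```

Keeping construction inside the behavior is what guarantees each call gets a clean, single-use condition-bean.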


@@ -0,0 +1,134 @@
package ${request.package}.cbean.bs;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import ${request.package}.bsentity.dbmeta.${table.camelizedName}Dbm;
import ${request.package}.cbean.${table.camelizedName}CB;
import ${request.package}.cbean.cq.${table.camelizedName}CQ;
import ${request.package}.cbean.cq.bs.Bs${table.camelizedName}CQ;
import org.dbflute.cbean.ConditionQuery;
import org.elasticsearch.action.count.CountRequestBuilder;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.index.query.QueryBuilder;
/**
* @author FreeGen
*/
public class Bs${table.camelizedName}CB extends AbstractConditionBean {
protected Bs${table.camelizedName}CQ _conditionQuery;
protected HpSpecification _specification;
@Override
public ${table.camelizedName}Dbm asDBMeta() {
return ${table.camelizedName}Dbm.getInstance();
}
@Override
public String asTableDbName() {
return "${table.name}";
}
@Override
public boolean hasSpecifiedColumn() {
return _specification != null;
}
@Override
public ConditionQuery localCQ() {
return doGetConditionQuery();
}
public ${table.camelizedName}CB acceptPK(String id) {
assertObjectNotNull("id", id);
Bs${table.camelizedName}CB cb = this;
cb.query().docMeta().setId_Equal(id);
return (${table.camelizedName}CB) this;
}
@Override
public void acceptPrimaryKeyMap(Map<String, ? extends Object> primaryKeyMap) {
acceptPK((String) primaryKeyMap.get("_id"));
}
@Override
public CountRequestBuilder build(CountRequestBuilder builder) {
if (_conditionQuery != null) {
QueryBuilder queryBuilder = _conditionQuery.getQuery();
if (queryBuilder != null) {
builder.setQuery(queryBuilder);
}
}
return builder;
}
@Override
public SearchRequestBuilder build(SearchRequestBuilder builder) {
if (_conditionQuery != null) {
QueryBuilder queryBuilder = _conditionQuery.getQuery();
if (queryBuilder != null) {
builder.setQuery(queryBuilder);
}
_conditionQuery.getFieldSortBuilderList().forEach(sort -> {
builder.addSort(sort);
});
}
if (_specification != null) {
builder.setFetchSource(_specification.columnList.toArray(new String[_specification.columnList.size()]), null);
}
return builder;
}
public Bs${table.camelizedName}CQ query() {
assertQueryPurpose();
return doGetConditionQuery();
}
protected Bs${table.camelizedName}CQ doGetConditionQuery() {
if (_conditionQuery == null) {
_conditionQuery = createLocalCQ();
}
return _conditionQuery;
}
protected Bs${table.camelizedName}CQ createLocalCQ() {
return new ${table.camelizedName}CQ();
}
public HpSpecification specify() {
assertSpecifyPurpose();
if (_specification == null) {
_specification = new HpSpecification();
}
return _specification;
}
protected void assertQueryPurpose() {
// TODO
}
protected void assertSpecifyPurpose() {
// TODO
}
public static class HpSpecification {
private List<String> columnList = new ArrayList<>();
private void doColumn(String name) {
columnList.add(name);
}
#foreach ($column in $table.columnList)
#if ($column.isNormalColumn)
public void column${column.capCamelName}() {
doColumn("${column.name}");
}
#end
#end
}
}
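`specify()` above accumulates column names so that `build(SearchRequestBuilder)` can pass them to `setFetchSource`, restricting `_source` to the selected fields. A self-contained sketch of that accumulation, using hypothetical column methods and no Elasticsearch dependency:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for HpSpecification: collects column names for _source filtering.
public class SpecifySketch {
    private final List<String> columnList = new ArrayList<>();

    public void columnUrl() { columnList.add("url"); }
    public void columnTitle() { columnList.add("title"); }

    // Mirrors the build(...) step: hand the collected names to setFetchSource.
    public String[] toIncludes() {
        return columnList.toArray(new String[columnList.size()]);
    }

    public static void main(String[] args) {
        SpecifySketch spec = new SpecifySketch();
        spec.columnUrl();
        spec.columnTitle();
        System.out.println(String.join(",", spec.toIncludes())); // url,title
    }
}
```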


@@ -0,0 +1,244 @@
package ${request.package}.cbean.cf.bs;
import java.util.Collection;
import ${request.package}.cbean.cf.${table.camelizedName}CF;
import ${request.package}.cbean.cq.${table.camelizedName}CQ;
import org.dbflute.exception.IllegalConditionBeanOperationException;
import org.dbflute.cbean.ckey.ConditionKey;
import org.elasticsearch.index.query.AndFilterBuilder;
import org.elasticsearch.index.query.BoolFilterBuilder;
import org.elasticsearch.index.query.ExistsFilterBuilder;
import org.elasticsearch.index.query.MissingFilterBuilder;
import org.elasticsearch.index.query.NotFilterBuilder;
import org.elasticsearch.index.query.OrFilterBuilder;
import org.elasticsearch.index.query.PrefixFilterBuilder;
import org.elasticsearch.index.query.QueryFilterBuilder;
import org.elasticsearch.index.query.RangeFilterBuilder;
import org.elasticsearch.index.query.TermFilterBuilder;
import org.elasticsearch.index.query.TermsFilterBuilder;
/**
* @author FreeGen
*/
public abstract class Bs${table.camelizedName}CF extends AbstractConditionFilter {
public void bool(BoolCall<${table.camelizedName}CF> boolLambda) {
bool(boolLambda, null);
}
public void bool(BoolCall<${table.camelizedName}CF> boolLambda, ConditionOptionCall<BoolFilterBuilder> opLambda) {
${table.camelizedName}CF mustFilter = new ${table.camelizedName}CF();
${table.camelizedName}CF shouldFilter = new ${table.camelizedName}CF();
${table.camelizedName}CF mustNotFilter = new ${table.camelizedName}CF();
boolLambda.callback(mustFilter, shouldFilter, mustNotFilter);
if (mustFilter.hasFilters() || shouldFilter.hasFilters() || mustNotFilter.hasFilters()) {
BoolFilterBuilder builder =
regBoolF(mustFilter.filterBuilderList, shouldFilter.filterBuilderList, mustNotFilter.filterBuilderList);
if (opLambda != null) {
opLambda.callback(builder);
}
}
}
public void and(OperatorCall<${table.camelizedName}CF> andLambda) {
and(andLambda, null);
}
public void and(OperatorCall<${table.camelizedName}CF> andLambda, ConditionOptionCall<AndFilterBuilder> opLambda) {
${table.camelizedName}CF andFilter = new ${table.camelizedName}CF();
andLambda.callback(andFilter);
if (andFilter.hasFilters()) {
AndFilterBuilder builder = regAndF(andFilter.filterBuilderList);
if (opLambda != null) {
opLambda.callback(builder);
}
}
}
public void or(OperatorCall<${table.camelizedName}CF> orLambda) {
or(orLambda, null);
}
public void or(OperatorCall<${table.camelizedName}CF> orLambda, ConditionOptionCall<OrFilterBuilder> opLambda) {
${table.camelizedName}CF orFilter = new ${table.camelizedName}CF();
orLambda.callback(orFilter);
if (orFilter.hasFilters()) {
OrFilterBuilder builder = regOrF(orFilter.filterBuilderList);
if (opLambda != null) {
opLambda.callback(builder);
}
}
}
public void not(OperatorCall<${table.camelizedName}CF> notLambda) {
not(notLambda, null);
}
public void not(OperatorCall<${table.camelizedName}CF> notLambda, ConditionOptionCall<NotFilterBuilder> opLambda) {
${table.camelizedName}CF notFilter = new ${table.camelizedName}CF();
notLambda.callback(notFilter);
if (notFilter.hasFilters()) {
if (notFilter.filterBuilderList.size() > 1) {
final String msg = "not filter must be one filter.";
throw new IllegalConditionBeanOperationException(msg);
}
NotFilterBuilder builder = regNotF(notFilter.filterBuilderList.get(0));
if (opLambda != null) {
opLambda.callback(builder);
}
}
}
public void query(org.codelibs.fess.es.cbean.cq.bs.AbstractConditionQuery.OperatorCall<${table.camelizedName}CQ> queryLambda) {
query(queryLambda, null);
}
public void query(org.codelibs.fess.es.cbean.cq.bs.AbstractConditionQuery.OperatorCall<${table.camelizedName}CQ> queryLambda,
ConditionOptionCall<QueryFilterBuilder> opLambda) {
${table.camelizedName}CQ query = new ${table.camelizedName}CQ();
queryLambda.callback(query);
if (query.hasQueries()) {
QueryFilterBuilder builder = regQueryF(query.getQuery());
if (opLambda != null) {
opLambda.callback(builder);
}
}
}
#foreach ($column in $table.columnList)
#if ($column.isNormalColumn)
#set ($javaNative = ${column.type})
public void set${column.capCamelName}_NotEqual($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_NotEqual(${column.uncapCamelName}, null, null);
}
public void set${column.capCamelName}_NotEqual($javaNative ${column.uncapCamelName}, ConditionOptionCall<NotFilterBuilder> notOpLambda,
ConditionOptionCall<TermFilterBuilder> eqOpLambda) {
not(subCf -> {
subCf.set${column.capCamelName}_Equal(${column.uncapCamelName}, eqOpLambda);
}, notOpLambda);
}
public void set${column.capCamelName}_Equal($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_Term(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_Equal($javaNative ${column.uncapCamelName}, ConditionOptionCall<TermFilterBuilder> opLambda) {
set${column.capCamelName}_Term(${column.uncapCamelName}, opLambda);
}
public void set${column.capCamelName}_Term($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_Term(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_Term($javaNative ${column.uncapCamelName}, ConditionOptionCall<TermFilterBuilder> opLambda) {
TermFilterBuilder builder = regTermF("${column.name}", ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_Terms(Collection<$javaNative> ${column.uncapCamelName}List) {
set${column.capCamelName}_Terms(${column.uncapCamelName}List, null);
}
public void set${column.capCamelName}_Terms(Collection<$javaNative> ${column.uncapCamelName}List, ConditionOptionCall<TermsFilterBuilder> opLambda) {
TermsFilterBuilder builder = regTermsF("${column.name}", ${column.uncapCamelName}List);
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_InScope(Collection<$javaNative> ${column.uncapCamelName}List) {
set${column.capCamelName}_Terms(${column.uncapCamelName}List, null);
}
public void set${column.capCamelName}_InScope(Collection<$javaNative> ${column.uncapCamelName}List, ConditionOptionCall<TermsFilterBuilder> opLambda) {
set${column.capCamelName}_Terms(${column.uncapCamelName}List, opLambda);
}
#if ($javaNative == "String")
public void set${column.capCamelName}_Prefix($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_Prefix(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_Prefix($javaNative ${column.uncapCamelName}, ConditionOptionCall<PrefixFilterBuilder> opLambda) {
PrefixFilterBuilder builder = regPrefixF("${column.name}", ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
#end
public void set${column.capCamelName}_Exists() {
set${column.capCamelName}_Exists(null);
}
public void set${column.capCamelName}_Exists(ConditionOptionCall<ExistsFilterBuilder> opLambda) {
ExistsFilterBuilder builder = regExistsF("${column.name}");
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_Missing() {
set${column.capCamelName}_Missing(null);
}
public void set${column.capCamelName}_Missing(ConditionOptionCall<MissingFilterBuilder> opLambda) {
MissingFilterBuilder builder = regMissingF("${column.name}");
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_GreaterThan($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_GreaterThan(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_GreaterThan($javaNative ${column.uncapCamelName}, ConditionOptionCall<RangeFilterBuilder> opLambda) {
RangeFilterBuilder builder = regRangeF("${column.name}", ConditionKey.CK_GREATER_THAN, ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_LessThan($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_LessThan(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_LessThan($javaNative ${column.uncapCamelName}, ConditionOptionCall<RangeFilterBuilder> opLambda) {
RangeFilterBuilder builder = regRangeF("${column.name}", ConditionKey.CK_LESS_THAN, ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_GreaterEqual($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_GreaterEqual(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_GreaterEqual($javaNative ${column.uncapCamelName}, ConditionOptionCall<RangeFilterBuilder> opLambda) {
RangeFilterBuilder builder = regRangeF("${column.name}", ConditionKey.CK_GREATER_EQUAL, ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_LessEqual($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_LessEqual(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_LessEqual($javaNative ${column.uncapCamelName}, ConditionOptionCall<RangeFilterBuilder> opLambda) {
RangeFilterBuilder builder = regRangeF("${column.name}", ConditionKey.CK_LESS_EQUAL, ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
#end
#end
}


@@ -0,0 +1,220 @@
package ${request.package}.cbean.cq.bs;
import java.util.Collection;
import ${request.package}.cbean.cq.${table.camelizedName}CQ;
import ${request.package}.cbean.cf.${table.camelizedName}CF;
import org.dbflute.cbean.ckey.ConditionKey;
import org.elasticsearch.index.query.BoolQueryBuilder;
import org.elasticsearch.index.query.FilteredQueryBuilder;
import org.elasticsearch.index.query.FuzzyQueryBuilder;
import org.elasticsearch.index.query.MatchQueryBuilder;
import org.elasticsearch.index.query.PrefixQueryBuilder;
import org.elasticsearch.index.query.RangeQueryBuilder;
import org.elasticsearch.index.query.TermQueryBuilder;
import org.elasticsearch.index.query.TermsQueryBuilder;
/**
* @author FreeGen
*/
public abstract class Bs${table.camelizedName}CQ extends AbstractConditionQuery {
@Override
public String asTableDbName() {
return "${table.name}";
}
@Override
public String xgetAliasName() {
return "${table.name}";
}
public void filtered(FilteredCall<${table.camelizedName}CQ, ${table.camelizedName}CF> filteredLambda) {
filtered(filteredLambda, null);
}
public void filtered(FilteredCall<${table.camelizedName}CQ, ${table.camelizedName}CF> filteredLambda,
ConditionOptionCall<FilteredQueryBuilder> opLambda) {
${table.camelizedName}CQ query = new ${table.camelizedName}CQ();
${table.camelizedName}CF filter = new ${table.camelizedName}CF();
filteredLambda.callback(query, filter);
if (query.hasQueries()) {
FilteredQueryBuilder builder = regFilteredQ(query.getQuery(), filter.getFilter());
if (opLambda != null) {
opLambda.callback(builder);
}
}
}
public void bool(BoolCall<${table.camelizedName}CQ> boolLambda) {
bool(boolLambda, null);
}
public void bool(BoolCall<${table.camelizedName}CQ> boolLambda, ConditionOptionCall<BoolQueryBuilder> opLambda) {
${table.camelizedName}CQ mustQuery = new ${table.camelizedName}CQ();
${table.camelizedName}CQ shouldQuery = new ${table.camelizedName}CQ();
${table.camelizedName}CQ mustNotQuery = new ${table.camelizedName}CQ();
boolLambda.callback(mustQuery, shouldQuery, mustNotQuery);
if (mustQuery.hasQueries() || shouldQuery.hasQueries() || mustNotQuery.hasQueries()) {
BoolQueryBuilder builder = regBoolCQ(mustQuery.queryBuilderList, shouldQuery.queryBuilderList, mustNotQuery.queryBuilderList);
if (opLambda != null) {
opLambda.callback(builder);
}
}
}
#foreach ($column in $table.columnList)
#if ($column.isNormalColumn)
#set ($javaNative = ${column.type})
public void set${column.capCamelName}_Equal($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_Term(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_Equal($javaNative ${column.uncapCamelName}, ConditionOptionCall<TermQueryBuilder> opLambda) {
set${column.capCamelName}_Term(${column.uncapCamelName}, opLambda);
}
public void set${column.capCamelName}_Term($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_Term(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_Term($javaNative ${column.uncapCamelName}, ConditionOptionCall<TermQueryBuilder> opLambda) {
TermQueryBuilder builder = regTermQ("${column.name}", ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_Terms(Collection<$javaNative> ${column.uncapCamelName}List) {
set${column.capCamelName}_Terms(${column.uncapCamelName}List, null);
}
public void set${column.capCamelName}_Terms(Collection<$javaNative> ${column.uncapCamelName}List, ConditionOptionCall<TermsQueryBuilder> opLambda) {
TermsQueryBuilder builder = regTermsQ("${column.name}", ${column.uncapCamelName}List);
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_InScope(Collection<$javaNative> ${column.uncapCamelName}List) {
set${column.capCamelName}_Terms(${column.uncapCamelName}List, null);
}
public void set${column.capCamelName}_InScope(Collection<$javaNative> ${column.uncapCamelName}List, ConditionOptionCall<TermsQueryBuilder> opLambda) {
set${column.capCamelName}_Terms(${column.uncapCamelName}List, opLambda);
}
public void set${column.capCamelName}_Match($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_Match(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_Match($javaNative ${column.uncapCamelName}, ConditionOptionCall<MatchQueryBuilder> opLambda) {
MatchQueryBuilder builder = regMatchQ("${column.name}", ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_MatchPhrase($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_MatchPhrase(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_MatchPhrase($javaNative ${column.uncapCamelName}, ConditionOptionCall<MatchQueryBuilder> opLambda) {
MatchQueryBuilder builder = regMatchPhraseQ("${column.name}", ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_MatchPhrasePrefix($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_MatchPhrasePrefix(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_MatchPhrasePrefix($javaNative ${column.uncapCamelName}, ConditionOptionCall<MatchQueryBuilder> opLambda) {
MatchQueryBuilder builder = regMatchPhrasePrefixQ("${column.name}", ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_Fuzzy($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_Fuzzy(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_Fuzzy($javaNative ${column.uncapCamelName}, ConditionOptionCall<FuzzyQueryBuilder> opLambda) {
FuzzyQueryBuilder builder = regFuzzyQ("${column.name}", ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
#if ($javaNative == "String")
public void set${column.capCamelName}_Prefix($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_Prefix(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_Prefix($javaNative ${column.uncapCamelName}, ConditionOptionCall<PrefixQueryBuilder> opLambda) {
PrefixQueryBuilder builder = regPrefixQ("${column.name}", ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
#end
public void set${column.capCamelName}_GreaterThan($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_GreaterThan(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_GreaterThan($javaNative ${column.uncapCamelName}, ConditionOptionCall<RangeQueryBuilder> opLambda) {
RangeQueryBuilder builder = regRangeQ("${column.name}", ConditionKey.CK_GREATER_THAN, ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_LessThan($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_LessThan(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_LessThan($javaNative ${column.uncapCamelName}, ConditionOptionCall<RangeQueryBuilder> opLambda) {
RangeQueryBuilder builder = regRangeQ("${column.name}", ConditionKey.CK_LESS_THAN, ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_GreaterEqual($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_GreaterEqual(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_GreaterEqual($javaNative ${column.uncapCamelName}, ConditionOptionCall<RangeQueryBuilder> opLambda) {
RangeQueryBuilder builder = regRangeQ("${column.name}", ConditionKey.CK_GREATER_EQUAL, ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public void set${column.capCamelName}_LessEqual($javaNative ${column.uncapCamelName}) {
set${column.capCamelName}_LessEqual(${column.uncapCamelName}, null);
}
public void set${column.capCamelName}_LessEqual($javaNative ${column.uncapCamelName}, ConditionOptionCall<RangeQueryBuilder> opLambda) {
RangeQueryBuilder builder = regRangeQ("${column.name}", ConditionKey.CK_LESS_EQUAL, ${column.uncapCamelName});
if (opLambda != null) {
opLambda.callback(builder);
}
}
public Bs${table.camelizedName}CQ addOrderBy_${column.capCamelName}_Asc() {
regOBA("${column.name}");
return this;
}
public Bs${table.camelizedName}CQ addOrderBy_${column.capCamelName}_Desc() {
regOBD("${column.name}");
return this;
}
#end
#end
}
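Every generated condition method above follows the same two-overload shape: the short form delegates with a `null` option, and the long form invokes the `ConditionOptionCall` on the freshly registered builder only when one was supplied. A dependency-free sketch of that shape, where the builder class is a hypothetical stand-in for an Elasticsearch `*QueryBuilder`:

```java
import java.util.function.Consumer;

// Stand-in builder that records one tweak applied by an option lambda.
class SketchBuilder {
    final String field;
    final Object value;
    float boost = 1.0f;

    SketchBuilder(String field, Object value) {
        this.field = field;
        this.value = value;
    }

    SketchBuilder boost(float b) { this.boost = b; return this; }
}

public class OptionCallSketch {
    // One-arg overload delegates with a null option, as in the template.
    static SketchBuilder setName_Term(String name) {
        return setName_Term(name, null);
    }

    static SketchBuilder setName_Term(String name, Consumer<SketchBuilder> opLambda) {
        SketchBuilder builder = new SketchBuilder("name", name);
        if (opLambda != null) { // option callback fires only when supplied
            opLambda.accept(builder);
        }
        return builder;
    }

    public static void main(String[] args) {
        System.out.println(setName_Term("fess").boost);                    // 1.0
        System.out.println(setName_Term("fess", b -> b.boost(2f)).boost);  // 2.0
    }
}
```

This keeps the common case terse while still exposing every builder knob through the optional lambda.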


@@ -0,0 +1,112 @@
package ${request.package}.bsentity;
#if ($table.get("_id") && $table.get("_id").get("path"))
#set ($idColumn = $table.get("_id").get("path"))
#end
import java.time.LocalDateTime;
import java.util.HashMap;
import java.util.Map;
import ${request.package}.bsentity.dbmeta.${table.camelizedName}Dbm;
#if ($table.hasRefColumn)
import ${request.package}.exentity.*;
#end
/**
* ${table.comment}
* @author FreeGen
*/
public class Bs${table.camelizedName} extends AbstractEntity {
private static final long serialVersionUID = 1L;
@Override
public ${table.camelizedName}Dbm asDBMeta() {
return ${table.camelizedName}Dbm.getInstance();
}
@Override
public String asTableDbName() {
return "${table.name}";
}
// ===================================================================================
// Attribute
// =========
#foreach ($column in $table.columnList)
#if ($column.isNormalColumn)
#set ($javaNative = ${column.type})
#elseif ($column.isRefColumn)
#set ($javaNative = ${column.camelizedName})
#end
#if ($column.name != $idColumn)
/** ${column.name} */
protected ${javaNative} ${column.uncapCamelName};
#end
#end
// [Referrers] *comment only
#foreach ($referrer in $table.referrerList)
// o ${referrer.name}
#end
// ===================================================================================
// Accessor
// ========
#foreach ($column in $table.columnList)
#if ($column.isNormalColumn)
#set ($javaNative = ${column.type})
#elseif ($column.isRefColumn)
#set ($javaNative = ${column.camelizedName})
#end
#if ($column.name != $idColumn)
#if ($javaNative == "boolean")
public ${javaNative} is${column.capCamelName}() {
checkSpecifiedProperty("${column.uncapCamelName}");
return ${column.uncapCamelName};
}
#else
public ${javaNative} get${column.capCamelName}() {
checkSpecifiedProperty("${column.uncapCamelName}");
return ${column.uncapCamelName};
}
#end
public void set${column.capCamelName}(${javaNative} value) {
registerModifiedProperty("${column.uncapCamelName}");
this.${column.uncapCamelName} = value;
}
#else
public ${javaNative} get${column.capCamelName}() {
checkSpecifiedProperty("${column.uncapCamelName}");
return asDocMeta().id();
}
public void set${column.capCamelName}(${javaNative} value) {
registerModifiedProperty("${column.uncapCamelName}");
asDocMeta().id(value);
}
#end
#end
@Override
public Map<String, Object> toSource() {
Map<String, Object> sourceMap = new HashMap<>();
#foreach ($column in $table.columnList)
#if ($column.isNormalColumn)
#if ($column.name != $idColumn)
if (${column.uncapCamelName} != null) {
sourceMap.put("${column.name}", ${column.uncapCamelName});
}
#else
if (asDocMeta().id() != null) {
sourceMap.put("${column.name}", asDocMeta().id());
}
#end
#end
#end
return sourceMap;
}
}
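The generated `toSource()` above only puts non-null columns into the source map, so null fields never reach the stored document. A standalone sketch of that behavior (the field names are hypothetical, not taken from a real table):

```java
import java.util.HashMap;
import java.util.Map;

public class ToSourceSketch {
    // Sketch of the generated toSource() pattern: each column is added
    // only when it is non-null, so absent values are simply omitted.
    public static Map<String, Object> toSource(String name, Long createdTime) {
        Map<String, Object> sourceMap = new HashMap<>();
        if (name != null) {
            sourceMap.put("name", name);
        }
        if (createdTime != null) {
            sourceMap.put("createdTime", createdTime);
        }
        return sourceMap;
    }

    public static void main(String[] args) {
        // The null column is left out of the document source entirely.
        System.out.println(toSource("fess", null)); // {name=fess}
    }
}
```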


@ -0,0 +1,154 @@
package ${request.package}.bsentity.dbmeta;
import java.util.List;
import java.util.Map;
import org.dbflute.Entity;
import org.dbflute.dbmeta.AbstractDBMeta;
import org.dbflute.dbmeta.info.ColumnInfo;
import org.dbflute.dbmeta.info.UniqueInfo;
import org.dbflute.dbmeta.name.TableSqlName;
import org.dbflute.dbway.DBDef;
public class ${table.camelizedName}Dbm extends AbstractDBMeta {
// ===================================================================================
// Singleton
// =========
private static final ${table.camelizedName}Dbm _instance = new ${table.camelizedName}Dbm();
private ${table.camelizedName}Dbm() {
}
public static ${table.camelizedName}Dbm getInstance() {
return _instance;
}
@Override
public String getProjectName() {
// TODO Auto-generated method stub
return null;
}
@Override
public String getProjectPrefix() {
// TODO Auto-generated method stub
return null;
}
@Override
public String getGenerationGapBasePrefix() {
// TODO Auto-generated method stub
return null;
}
@Override
public DBDef getCurrentDBDef() {
// TODO Auto-generated method stub
return null;
}
@Override
public String getTableDbName() {
// TODO Auto-generated method stub
return null;
}
@Override
public String getTableDispName() {
// TODO Auto-generated method stub
return null;
}
@Override
public String getTablePropertyName() {
// TODO Auto-generated method stub
return null;
}
@Override
public TableSqlName getTableSqlName() {
// TODO Auto-generated method stub
return null;
}
@Override
public boolean hasPrimaryKey() {
// TODO Auto-generated method stub
return false;
}
@Override
public boolean hasCompoundPrimaryKey() {
// TODO Auto-generated method stub
return false;
}
@Override
public String getEntityTypeName() {
// TODO Auto-generated method stub
return null;
}
@Override
public String getConditionBeanTypeName() {
// TODO Auto-generated method stub
return null;
}
@Override
public String getBehaviorTypeName() {
// TODO Auto-generated method stub
return null;
}
@Override
public Class<? extends Entity> getEntityType() {
// TODO Auto-generated method stub
return null;
}
@Override
public Entity newEntity() {
// TODO Auto-generated method stub
return null;
}
@Override
public void acceptPrimaryKeyMap(Entity entity, Map<String, ? extends Object> primaryKeyMap) {
// TODO Auto-generated method stub
}
@Override
public void acceptAllColumnMap(Entity entity, Map<String, ? extends Object> allColumnMap) {
// TODO Auto-generated method stub
}
@Override
public Map<String, Object> extractPrimaryKeyMap(Entity entity) {
// TODO Auto-generated method stub
return null;
}
@Override
public Map<String, Object> extractAllColumnMap(Entity entity) {
// TODO Auto-generated method stub
return null;
}
@Override
protected List<ColumnInfo> ccil() {
// TODO Auto-generated method stub
return null;
}
@Override
protected UniqueInfo cpui() {
// TODO Auto-generated method stub
return null;
}
}


@ -0,0 +1,50 @@
package ${request.package}.cbean.result;
import org.dbflute.cbean.result.PagingResultBean;
public class EsPagingResultBean<ENTITY> extends PagingResultBean<ENTITY> {
private static final long serialVersionUID = 1L;
protected long took;
private int totalShards;
private int successfulShards;
private int failedShards;
public long getTook() {
return took;
}
public void setTook(long took) {
this.took = took;
}
public int getTotalShards() {
return totalShards;
}
public void setTotalShards(int totalShards) {
this.totalShards = totalShards;
}
public int getSuccessfulShards() {
return successfulShards;
}
public void setSuccessfulShards(int successfulShards) {
this.successfulShards = successfulShards;
}
public int getFailedShards() {
return failedShards;
}
public void setFailedShards(int failedShards) {
this.failedShards = failedShards;
}
}


@ -0,0 +1,10 @@
package ${request.package}.exbhv;
import ${request.package}.bsbhv.Bs${table.camelizedName}Bhv;
/**
* @author FreeGen
*/
public class ${table.camelizedName}Bhv extends Bs${table.camelizedName}Bhv {
}


@ -0,0 +1,9 @@
package ${request.package}.cbean;
import ${request.package}.cbean.bs.Bs${table.camelizedName}CB;
/**
* @author FreeGen
*/
public class ${table.camelizedName}CB extends Bs${table.camelizedName}CB {
}


@ -0,0 +1,9 @@
package ${request.package}.cbean.cf;
import ${request.package}.cbean.cf.bs.Bs${table.camelizedName}CF;
/**
* @author FreeGen
*/
public class ${table.camelizedName}CF extends Bs${table.camelizedName}CF {
}


@ -0,0 +1,9 @@
package ${request.package}.cbean.cq;
import ${request.package}.cbean.cq.bs.Bs${table.camelizedName}CQ;
/**
* @author FreeGen
*/
public class ${table.camelizedName}CQ extends Bs${table.camelizedName}CQ {
}


@ -0,0 +1,11 @@
package ${request.package}.exentity;
import ${request.package}.bsentity.Bs${table.camelizedName};
/**
* @author FreeGen
*/
public class ${table.camelizedName} extends Bs${table.camelizedName} {
private static final long serialVersionUID = 1L;
}


@ -0,0 +1,69 @@
package ${request.package}.cbean.sqlclause;
import org.dbflute.cbean.sqlclause.AbstractSqlClause;
import org.dbflute.dbway.DBWay;
public class SqlClauseEs extends AbstractSqlClause {
private static final long serialVersionUID = 1L;
public SqlClauseEs(String tableDbName) {
super(tableDbName);
}
@Override
public void lockForUpdate() {
// TODO Auto-generated method stub
}
@Override
public DBWay dbway() {
// TODO Auto-generated method stub
return null;
}
@Override
protected void doFetchFirst() {
// TODO Auto-generated method stub
}
@Override
protected void doFetchPage() {
// TODO Auto-generated method stub
}
@Override
protected void doClearFetchPageClause() {
// TODO Auto-generated method stub
}
@Override
protected String createSelectHint() {
// TODO Auto-generated method stub
return null;
}
@Override
protected String createFromBaseTableHint() {
// TODO Auto-generated method stub
return null;
}
@Override
protected String createFromHint() {
// TODO Auto-generated method stub
return null;
}
@Override
protected String createSqlSuffix() {
// TODO Auto-generated method stub
return null;
}
}


@ -0,0 +1,4 @@
Directory for log files of DBFlute tasks
If your execution of a DBFlute task fails,
look at the log file "dbflute.log" to debug.

dbflute_fess/manage.bat

@ -0,0 +1,18 @@
@echo off
setlocal
%~d0
cd %~p0
call _project.bat
:: tilde to remove double quotation
set FIRST_ARG=%~1
if "%FIRST_ARG%"=="" set FIRST_ARG=""
set SECOND_ARG=%2
if "%SECOND_ARG%"=="" set SECOND_ARG=""
call %DBFLUTE_HOME%\etc\cmd\_df-manage.cmd %MY_PROPERTIES_PATH% "%FIRST_ARG%" %SECOND_ARG%
if "%pause_at_end%"=="y" (
pause
)

dbflute_fess/manage.sh

@ -0,0 +1,14 @@
#!/bin/bash
cd `dirname $0`
. ./_project.sh
FIRST_ARG=$1
SECOND_ARG=$2
sh $DBFLUTE_HOME/etc/cmd/_df-manage.sh $MY_PROPERTIES_PATH $FIRST_ARG $SECOND_ARG
taskReturnCode=$?
if [ $taskReturnCode -ne 0 ];then
exit $taskReturnCode;
fi


@ -0,0 +1,3 @@
Directory for auto-generated documents
e.g. SchemaHTML, HistoryHTML


@ -0,0 +1,50 @@
Directory for ReplaceSchema task
replace-schema-*.sql:
DDL statements for creating your schema.
You should write your own DDL statements in this file.
(The SQL separator is a semicolon ";")
take-finally-*.sql:
SQL statements to check the loaded data (or DDL to run after data loading).
You should write your own SQL statements in this file.
(basically the same specifications as replace-schema.sql)
The "data" directory is for data loading like this:
/- - - - - - - - - - - - - - - - - - - -
playsql
|-data
|-common
| |-xls
| |-10-master.xls
| |-defaultValueMap.dataprop
|-ut
|-xls
|-20-member.xls
|-30-product.xls
|-defaultValueMap.dataprop
- - - - - - - - - -/
The format of an xls file is like this:
/- - - - - - - - - - - - - - - - - - - -
|MEMBER_ID|MEMBER_NAME|BIRTHDATE |
| 1|Stojkovic |1965/03/03|
| 2|Savicevic | |
| 3|... |... |
(Sheet)
MEMBER / MEMBER_LOGIN / MEMBER_SECURITY
- - - - - - - - - -/
The defaultValueMap.dataprop is for common columns like this:
/- - - - - - - - - - - - - - - - - - - -
map:{
; REGISTER_DATETIME = sysdate
; REGISTER_USER = foo
; REGISTER_PROCESS = bar
; UPDATE_DATETIME = sysdate
; UPDATE_USER = foo
; UPDATE_PROCESS = bar
; VERSION_NO = 0
}
- - - - - - - - - -/
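One way to picture how defaultValueMap.dataprop interacts with the xls rows: the defaults fill common columns the row does not provide. A minimal sketch (the merge direction here — row values winning over defaults — is an assumption for illustration, not DBFlute's exact loader behavior):

```java
import java.util.HashMap;
import java.util.Map;

public class DefaultValueSketch {
    // Assumed behavior: values present in the xls row win; the
    // defaultValueMap only fills in the missing common columns.
    public static Map<String, Object> applyDefaults(Map<String, Object> row,
            Map<String, Object> defaults) {
        Map<String, Object> merged = new HashMap<>(defaults);
        merged.putAll(row); // explicit row values override the defaults
        return merged;
    }

    public static void main(String[] args) {
        Map<String, Object> defaults = new HashMap<>();
        defaults.put("REGISTER_USER", "foo");
        defaults.put("VERSION_NO", 0);

        Map<String, Object> row = new HashMap<>();
        row.put("MEMBER_NAME", "Stojkovic");

        // The loaded row ends up with its own column plus both defaults.
        System.out.println(applyDefaults(row, defaults).size()); // 3
    }
}
```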


@ -0,0 +1 @@


@ -0,0 +1 @@


@ -0,0 +1,4 @@
Directory for files of schema info
Files are auto-generated by DBFlute tasks.
Basically you don't need to touch this directory.

pom.xml

@ -33,29 +33,6 @@
<artifactId>oss-parent</artifactId>
<version>9</version>
</parent>
<profiles>
<profile>
<id>h2</id>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
<properties>
<database>h2</database>
<databaseGroupId>com.h2database</databaseGroupId>
<databaseArtifactId>h2</databaseArtifactId>
<databaseVersion>1.4.181</databaseVersion>
</properties>
</profile>
<profile>
<id>mysql</id>
<properties>
<database>mysql</database>
<databaseGroupId>mysql</databaseGroupId>
<databaseArtifactId>mysql-connector-java</databaseArtifactId>
<databaseVersion>5.1.32</databaseVersion>
</properties>
</profile>
</profiles>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<dbflute.version>1.1.0-sp1</dbflute.version>
@ -66,7 +43,7 @@
<pdfbox.version>1.8.7</pdfbox.version>
<saflute.version>1.0.0-SNAPSHOT</saflute.version>
<elasticsearch.version>1.6.0</elasticsearch.version>
<cluster.runner.version>1.6.0.0-SNAPSHOT</cluster.runner.version>
<cluster.runner.version>1.6.0.0</cluster.runner.version>
<!-- Tomcat -->
<tomcat.delegate>true</tomcat.delegate>
<tomcat.useSeparateTomcatClassLoader>true</tomcat.useSeparateTomcatClassLoader>
@ -76,9 +53,6 @@
<build>
<finalName>fess</finalName>
<resources>
<resource>
<directory>src/main/${database}/resources</directory>
</resource>
<resource>
<directory>src/main/resources</directory>
</resource>
@ -114,9 +88,6 @@
<resource>
<directory>${project.build.directory}/${project.build.finalName}-compress</directory>
</resource>
<resource>
<directory>${basedir}/src/main/${database}/webapp</directory>
</resource>
</webResources>
<warSourceExcludes>WEB-INF/classes/**/*.*,WEB-INF/lib/*.jar</warSourceExcludes>
</configuration>
@ -203,6 +174,17 @@
<artifactId>tomcat8-maven-plugin</artifactId>
<version>3.0-SNAPSHOT</version>
</plugin>
<plugin>
<groupId>org.dbflute</groupId>
<artifactId>dbflute-maven-plugin</artifactId>
<version>1.1.0</version>
<configuration>
<dbfluteVersion>${dbflute.version}</dbfluteVersion>
<packageBase>org.codelibs.fess.db</packageBase>
<clientProject>fess</clientProject>
<dbfluteClientDir>${basedir}/dbflute_fess</dbfluteClientDir>
</configuration>
</plugin>
</plugins>
</build>
<pluginRepositories>
@ -261,16 +243,6 @@
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>${databaseGroupId}</groupId>
<artifactId>${databaseArtifactId}</artifactId>
<version>${databaseVersion}</version>
</dependency>
<dependency>
<groupId>org.codelibs.fess</groupId>
<artifactId>fess-db-${database}</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>jstl</groupId>
<artifactId>jstl</artifactId>
@ -451,7 +423,7 @@
</dependency>
<dependency>
<groupId>org.codelibs.robot</groupId>
<artifactId>s2-robot-db-${database}</artifactId>
<artifactId>s2-robot-db-h2</artifactId>
<version>${s2robot.version}</version>
<exclusions>
<exclusion>


@ -1,36 +0,0 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<!-- If this file is found in the config directory, it will only be
loaded once at startup. If it is found in Solr's data
directory, it will be re-loaded every commit.
-->
<elevate>
<query text="foo bar">
<doc id="1" />
<doc id="2" />
<doc id="3" />
</query>
<query text="ipod">
<doc id="MA147LL/A" /> <!-- put the actual ipod at the top -->
<doc id="IW-02" exclude="true" /> <!-- exclude this cable -->
</query>
</elevate>


@ -1,21 +0,0 @@
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#-----------------------------------------------------------------------
# Use a protected word file to protect against the stemmer reducing two
# unrelated words to the same base word.
# Some non-words that normally won't be encountered,
# just to test that they won't be stemmed.
dontstems
zwhacky


@ -1,132 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<schema name="fess" version="1.2">
<types>
<fieldType name="text" class="solr.TextField" positionIncrementGap="100">
<analyzer type="index">
<tokenizer class="solr.CJKTokenizerFactory"/>
<!--
<tokenizer class="solr.WhitespaceTokenizerFactory"/>
<tokenizer class="solr.StandardTokenizerFactory"/>
-->
<!--
<filter class="solr.SynonymFilterFactory" synonyms="index_synonyms.txt" ignoreCase="true" expand="false"/>
-->
<filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" enablePositionIncrements="true" />
<filter class="solr.WordDelimiterFilterFactory" generateWordParts="1" generateNumberParts="1" catenateWords="1" catenateNumbers="1" catenateAll="0" splitOnCaseChange="1"/>
<filter class="solr.LowerCaseFilterFactory"/>
<filter class="solr.SnowballPorterFilterFactory" language="English" protected="protwords.txt"/>
<!-- <filter class="solr.RemoveDuplicatesTokenFilterFactory"/> -->
</analyzer>
<analyzer type="query">
<tokenizer class="solr.WhitespaceTokenizerFactory"/>
<filter class="solr.SynonymFilterFactory" synonyms="synonyms.txt" ignoreCase="true" expand="true"/>
<filter class="solr.StopFilterFactory"
ignoreCase="true"
words="stopwords.txt"
enablePositionIncrements="true"
/>
<filter class="solr.WordDelimiterFilterFactory" generateWordParts="1" generateNumberParts="1" catenateWords="0" catenateNumbers="0" catenateAll="0" splitOnCaseChange="1"/>
<filter class="solr.LowerCaseFilterFactory"/>
<filter class="solr.SnowballPorterFilterFactory" language="English" protected="protwords.txt"/>
<!-- <filter class="solr.RemoveDuplicatesTokenFilterFactory"/> -->
</analyzer>
</fieldType>
<fieldType name="url" class="solr.TextField" positionIncrementGap="100">
<analyzer>
<tokenizer class="solr.StandardTokenizerFactory"/>
<filter class="solr.LowerCaseFilterFactory"/>
<filter class="solr.WordDelimiterFilterFactory" generateWordParts="1" generateNumberParts="1"/>
<filter class="solr.RemoveDuplicatesTokenFilterFactory"/>
</analyzer>
</fieldType>
<fieldType name="string" class="solr.StrField" sortMissingLast="true" omitNorms="true"/>
<fieldType name="boolean" class="solr.BoolField" sortMissingLast="true" omitNorms="true"/>
<fieldtype name="binary" class="solr.BinaryField"/>
<fieldType name="int" class="solr.TrieIntField" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
<fieldType name="float" class="solr.TrieFloatField" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
<fieldType name="long" class="solr.TrieLongField" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
<fieldType name="double" class="solr.TrieDoubleField" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
<fieldType name="tint" class="solr.TrieIntField" precisionStep="8" omitNorms="true" positionIncrementGap="0"/>
<fieldType name="tfloat" class="solr.TrieFloatField" precisionStep="8" omitNorms="true" positionIncrementGap="0"/>
<fieldType name="tlong" class="solr.TrieLongField" precisionStep="8" omitNorms="true" positionIncrementGap="0"/>
<fieldType name="tdouble" class="solr.TrieDoubleField" precisionStep="8" omitNorms="true" positionIncrementGap="0"/>
<fieldType name="date" class="solr.TrieDateField" omitNorms="true" precisionStep="0" positionIncrementGap="0"/>
<fieldType name="tdate" class="solr.TrieDateField" omitNorms="true" precisionStep="6" positionIncrementGap="0"/>
<fieldType name="pint" class="solr.IntField" omitNorms="true"/>
<fieldType name="plong" class="solr.LongField" omitNorms="true"/>
<fieldType name="pfloat" class="solr.FloatField" omitNorms="true"/>
<fieldType name="pdouble" class="solr.DoubleField" omitNorms="true"/>
<fieldType name="pdate" class="solr.DateField" sortMissingLast="true" omitNorms="true"/>
<fieldType name="sint" class="solr.SortableIntField" sortMissingLast="true" omitNorms="true"/>
<fieldType name="slong" class="solr.SortableLongField" sortMissingLast="true" omitNorms="true"/>
<fieldType name="sfloat" class="solr.SortableFloatField" sortMissingLast="true" omitNorms="true"/>
<fieldType name="sdouble" class="solr.SortableDoubleField" sortMissingLast="true" omitNorms="true"/>
<fieldType name="random" class="solr.RandomSortField" indexed="true" />
</types>
<fields>
<field name="id" type="string" stored="true" indexed="true"/>
<!-- core fields -->
<field name="segment" type="string" stored="true" indexed="true"/>
<field name="digest" type="text" stored="true" indexed="false"/>
<field name="boost" type="float" stored="true" indexed="false"/>
<field name="host" type="url" stored="true" indexed="true"/>
<field name="site" type="string" stored="true" indexed="false"/>
<field name="url" type="url" stored="true" indexed="true" required="true"/>
<field name="content" type="text" stored="false" indexed="true"/>
<field name="title" type="text" stored="true" indexed="true"/>
<field name="cache" type="text" stored="true" indexed="false" compressed="true"/>
<field name="tstamp" type="slong" stored="true" indexed="true"/>
<field name="anchor" type="string" stored="true" indexed="true" multiValued="true"/>
<field name="contentLength" type="slong" stored="true" indexed="true"/>
<field name="lastModified" type="slong" stored="true" indexed="true"/>
<field name="date" type="string" stored="true" indexed="true"/>
<field name="lang" type="string" stored="true" indexed="true"/>
<field name="mimetype" type="string" stored="true" indexed="true"/>
<!-- multi values -->
<field name="type" type="string" stored="true" indexed="true" multiValued="true"/>
<field name="label" type="string" stored="true" indexed="true" multiValued="true"/>
<field name="role" type="string" stored="true" indexed="true" multiValued="true"/>
<!-- Dynamic field definitions -->
<dynamicField name="*_s" type="string" indexed="true" stored="true"/>
<dynamicField name="*_t" type="text" indexed="true" stored="true"/>
<dynamicField name="*_b" type="boolean" indexed="true" stored="true"/>
<dynamicField name="*_i" type="int" indexed="true" stored="true"/>
<dynamicField name="*_l" type="long" indexed="true" stored="true"/>
<dynamicField name="*_f" type="float" indexed="true" stored="true"/>
<dynamicField name="*_d" type="double" indexed="true" stored="true"/>
<dynamicField name="*_ti" type="tint" indexed="true" stored="true"/>
<dynamicField name="*_tl" type="tlong" indexed="true" stored="true"/>
<dynamicField name="*_tf" type="tfloat" indexed="true" stored="true"/>
<dynamicField name="*_td" type="tdouble" indexed="true" stored="true"/>
<dynamicField name="*_tdt" type="tdate" indexed="true" stored="true"/>
<dynamicField name="*_pi" type="pint" indexed="true" stored="true"/>
<dynamicField name="*_pl" type="plong" indexed="true" stored="true"/>
<dynamicField name="*_pf" type="pfloat" indexed="true" stored="true"/>
<dynamicField name="*_pd" type="pdouble" indexed="true" stored="true"/>
<dynamicField name="*_pdt" type="pdate" indexed="true" stored="true"/>
<dynamicField name="*_si" type="sint" indexed="true" stored="true"/>
<dynamicField name="*_sl" type="slong" indexed="true" stored="true"/>
<dynamicField name="*_sf" type="sfloat" indexed="true" stored="true"/>
<dynamicField name="*_sd" type="sdouble" indexed="true" stored="true"/>
<dynamicField name="*_dt" type="date" indexed="true" stored="true"/>
</fields>
<uniqueKey>id</uniqueKey>
<defaultSearchField>content</defaultSearchField>
<solrQueryParser defaultOperator="AND"/>
</schema>

File diff suppressed because it is too large


@ -1,57 +0,0 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#-----------------------------------------------------------------------
# a couple of test stopwords to test that the words are really being
# configured from this file:
stopworda
stopwordb
#Standard english stop words taken from Lucene's StopAnalyzer
an
and
are
as
at
be
but
by
for
if
in
into
is
it
no
not
of
on
or
s
such
t
that
the
their
then
there
these
they
this
to
was
will
with


@ -1,31 +0,0 @@
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#-----------------------------------------------------------------------
#some test synonym mappings unlikely to appear in real input text
aaa => aaaa
bbb => bbbb1 bbbb2
ccc => cccc1,cccc2
a\=>a => b\=>b
a\,a => b\,b
fooaaa,baraaa,bazaaa
# Some synonym groups specific to this example
GB,gib,gigabyte,gigabytes
MB,mib,megabyte,megabytes
Television, Televisions, TV, TVs
#notice we use "gib" instead of "GiB" so any WordDelimiterFilter coming
#after us won't split it into two words.
# Synonym mappings can be used for spelling correction too
pixima => pixma

File diff suppressed because it is too large


@ -0,0 +1,174 @@
{
"search_log" : {
"aliases" : {
"search_log_search" : {},
"search_log_index" : {}
},
"mappings" : {
"user_info" : {
"_all" : {
"enabled" : false
},
"_id" : {
"path" : "id"
},
"properties" : {
"code" : {
"type" : "string",
"index" : "not_analyzed"
},
"createdTime" : {
"type" : "long"
},
"id" : {
"type" : "string",
"index" : "not_analyzed"
},
"updatedTime" : {
"type" : "long"
}
}
},
"search_log" : {
"_all" : {
"enabled" : false
},
"_id" : {
"path" : "id"
},
"properties" : {
"accessType" : {
"type" : "string",
"index" : "not_analyzed"
},
"clientIp" : {
"type" : "string",
"index" : "not_analyzed"
},
"hitCount" : {
"type" : "long"
},
"id" : {
"type" : "string",
"index" : "not_analyzed"
},
"queryOffset" : {
"type" : "integer"
},
"queryPageSize" : {
"type" : "integer"
},
"referer" : {
"type" : "string",
"index" : "not_analyzed"
},
"requestedTime" : {
"type" : "long"
},
"responseTime" : {
"type" : "integer"
},
"searchWord" : {
"type" : "string",
"index" : "not_analyzed"
},
"userAgent" : {
"type" : "string",
"index" : "not_analyzed"
},
"userInfoId" : {
"type" : "string",
"index" : "not_analyzed"
},
"userSessionId" : {
"type" : "string",
"index" : "not_analyzed"
}
}
},
"search_field_log" : {
"_all" : {
"enabled" : false
},
"_id" : {
"path" : "id"
},
"properties" : {
"id" : {
"type" : "string",
"index" : "not_analyzed"
},
"name" : {
"type" : "string",
"index" : "not_analyzed"
},
"searchLogId" : {
"type" : "string",
"index" : "not_analyzed"
},
"value" : {
"type" : "string",
"index" : "not_analyzed"
}
}
},
"favorite_log" : {
"_all" : {
"enabled" : false
},
"_id" : {
"path" : "id"
},
"properties" : {
"createdTime" : {
"type" : "long"
},
"id" : {
"type" : "string",
"index" : "not_analyzed"
},
"url" : {
"type" : "string",
"index" : "not_analyzed"
},
"userInfoId" : {
"type" : "string",
"index" : "not_analyzed"
}
}
},
"click_log" : {
"_all" : {
"enabled" : false
},
"_id" : {
"path" : "id"
},
"properties" : {
"id" : {
"type" : "string",
"index" : "not_analyzed"
},
"requestedTime" : {
"type" : "long"
},
"searchLogId" : {
"type" : "string",
"index" : "not_analyzed"
},
"url" : {
"type" : "string",
"index" : "not_analyzed"
}
}
}
},
"settings" : {
"index" : {
"refresh_interval" : "1m",
"number_of_shards" : "10",
"number_of_replicas" : "0"
}
}
}
}


@ -1,572 +0,0 @@
DROP TABLE IF EXISTS FAVORITE_LOG;
DROP TABLE IF EXISTS SEARCH_FIELD_LOG;
DROP TABLE IF EXISTS FILE_AUTHENTICATION;
DROP TABLE IF EXISTS FAILURE_URL;
DROP TABLE IF EXISTS CLICK_LOG;
DROP TABLE IF EXISTS LABEL_TYPE_TO_ROLE_TYPE_MAPPING;
DROP TABLE IF EXISTS SEARCH_LOG;
DROP TABLE IF EXISTS USER_INFO;
DROP TABLE IF EXISTS DATA_CONFIG_TO_BROWSER_TYPE_MAPPING;
DROP TABLE IF EXISTS DATA_CONFIG_TO_LABEL_TYPE_MAPPING;
DROP TABLE IF EXISTS DATA_CONFIG_TO_ROLE_TYPE_MAPPING;
DROP TABLE IF EXISTS DATA_CRAWLING_CONFIG;
DROP TABLE IF EXISTS WEB_CONFIG_TO_ROLE_TYPE_MAPPING;
DROP TABLE IF EXISTS FILE_CONFIG_TO_ROLE_TYPE_MAPPING;
DROP TABLE IF EXISTS ROLE_TYPE;
DROP TABLE IF EXISTS WEB_CONFIG_TO_LABEL_TYPE_MAPPING;
DROP TABLE IF EXISTS FILE_CONFIG_TO_LABEL_TYPE_MAPPING;
DROP TABLE IF EXISTS LABEL_TYPE;
DROP TABLE IF EXISTS CRAWLING_SESSION_INFO;
DROP TABLE IF EXISTS WEB_AUTHENTICATION;
DROP TABLE IF EXISTS KEY_MATCH;
DROP TABLE IF EXISTS BOOST_DOCUMENT_RULE;
DROP TABLE IF EXISTS REQUEST_HEADER;
DROP TABLE IF EXISTS OVERLAPPING_HOST;
DROP TABLE IF EXISTS CRAWLING_SESSION;
DROP TABLE IF EXISTS PATH_MAPPING;
DROP TABLE IF EXISTS JOB_LOG;
DROP TABLE IF EXISTS SCHEDULED_JOB;
DROP TABLE IF EXISTS FILE_CONFIG_TO_BROWSER_TYPE_MAPPING;
DROP TABLE IF EXISTS WEB_CONFIG_TO_BROWSER_TYPE_MAPPING;
DROP TABLE IF EXISTS FILE_CRAWLING_CONFIG;
DROP TABLE IF EXISTS BROWSER_TYPE;
DROP TABLE IF EXISTS WEB_CRAWLING_CONFIG;
DROP TABLE IF EXISTS SUGGEST_BAD_WORD;
DROP TABLE IF EXISTS SUGGEST_ELAVATE_WORD;
/**********************************/
/* Table Name: Web Crawling Config */
/**********************************/
CREATE TABLE WEB_CRAWLING_CONFIG(
ID IDENTITY NOT NULL PRIMARY KEY,
NAME VARCHAR(200) NOT NULL,
URLS VARCHAR(4000) NOT NULL,
INCLUDED_URLS VARCHAR(4000),
EXCLUDED_URLS VARCHAR(4000),
INCLUDED_DOC_URLS VARCHAR(4000),
EXCLUDED_DOC_URLS VARCHAR(4000),
CONFIG_PARAMETER VARCHAR(4000),
DEPTH INTEGER,
MAX_ACCESS_COUNT BIGINT,
USER_AGENT VARCHAR(200) NOT NULL,
NUM_OF_THREAD INTEGER NOT NULL,
INTERVAL_TIME INTEGER NOT NULL,
BOOST FLOAT NOT NULL,
AVAILABLE VARCHAR(1) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: File Crawling Config */
/**********************************/
CREATE TABLE FILE_CRAWLING_CONFIG(
ID IDENTITY NOT NULL PRIMARY KEY,
NAME VARCHAR(200) NOT NULL,
PATHS VARCHAR(4000) NOT NULL,
INCLUDED_PATHS VARCHAR(4000),
EXCLUDED_PATHS VARCHAR(4000),
INCLUDED_DOC_PATHS VARCHAR(4000),
EXCLUDED_DOC_PATHS VARCHAR(4000),
CONFIG_PARAMETER VARCHAR(4000),
DEPTH INTEGER,
MAX_ACCESS_COUNT BIGINT,
NUM_OF_THREAD INTEGER NOT NULL,
INTERVAL_TIME INTEGER NOT NULL,
BOOST FLOAT NOT NULL,
AVAILABLE VARCHAR(1) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Scheduled Job */
/**********************************/
CREATE TABLE SCHEDULED_JOB(
ID IDENTITY NOT NULL PRIMARY KEY,
NAME VARCHAR(100) NOT NULL,
TARGET VARCHAR(100) NOT NULL,
CRON_EXPRESSION VARCHAR(100) NOT NULL,
SCRIPT_TYPE VARCHAR(100) NOT NULL,
SCRIPT_DATA VARCHAR(4000),
CRAWLER VARCHAR(1) NOT NULL,
JOB_LOGGING VARCHAR(1) NOT NULL,
AVAILABLE VARCHAR(1) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Job Log */
/**********************************/
CREATE TABLE JOB_LOG(
ID IDENTITY NOT NULL PRIMARY KEY,
JOB_NAME VARCHAR(100) NOT NULL,
JOB_STATUS VARCHAR(10) NOT NULL,
TARGET VARCHAR(100) NOT NULL,
SCRIPT_TYPE VARCHAR(100) NOT NULL,
SCRIPT_DATA VARCHAR(4000),
SCRIPT_RESULT VARCHAR(4000),
START_TIME TIMESTAMP NOT NULL,
END_TIME TIMESTAMP
);
/**********************************/
/* Table Name: Path Mapping */
/**********************************/
CREATE TABLE PATH_MAPPING(
ID IDENTITY NOT NULL PRIMARY KEY,
REGEX VARCHAR(1000) NOT NULL,
REPLACEMENT VARCHAR(1000) NOT NULL,
PROCESS_TYPE VARCHAR(1) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Crawling Session */
/**********************************/
CREATE TABLE CRAWLING_SESSION(
ID IDENTITY NOT NULL PRIMARY KEY,
SESSION_ID VARCHAR(20) NOT NULL,
NAME VARCHAR(20),
EXPIRED_TIME TIMESTAMP,
CREATED_TIME TIMESTAMP NOT NULL
);
/**********************************/
/* Table Name: Overlapping Host */
/**********************************/
CREATE TABLE OVERLAPPING_HOST(
ID IDENTITY NOT NULL PRIMARY KEY,
REGULAR_NAME VARCHAR(1000) NOT NULL,
OVERLAPPING_NAME VARCHAR(1000) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Request Header */
/**********************************/
CREATE TABLE REQUEST_HEADER(
ID IDENTITY NOT NULL PRIMARY KEY,
NAME VARCHAR(100) NOT NULL,
VALUE VARCHAR(1000) NOT NULL,
WEB_CRAWLING_CONFIG_ID BIGINT NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL,
FOREIGN KEY (WEB_CRAWLING_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID)
);
/**********************************/
/* Table Name: Boost Document Rule */
/**********************************/
CREATE TABLE BOOST_DOCUMENT_RULE(
ID IDENTITY NOT NULL PRIMARY KEY,
URL_EXPR VARCHAR(4000) NOT NULL,
BOOST_EXPR VARCHAR(4000) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Key Match */
/**********************************/
CREATE TABLE KEY_MATCH(
ID IDENTITY NOT NULL PRIMARY KEY,
TERM VARCHAR(200) NOT NULL,
QUERY VARCHAR(4000) NOT NULL,
MAX_SIZE INTEGER NOT NULL,
BOOST FLOAT NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Web Authentication */
/**********************************/
CREATE TABLE WEB_AUTHENTICATION(
ID IDENTITY NOT NULL PRIMARY KEY,
HOSTNAME VARCHAR(100),
PORT INTEGER NOT NULL,
AUTH_REALM VARCHAR(100),
PROTOCOL_SCHEME VARCHAR(10),
USERNAME VARCHAR(100) NOT NULL,
PASSWORD VARCHAR(100),
PARAMETERS VARCHAR(1000),
WEB_CRAWLING_CONFIG_ID BIGINT NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL,
FOREIGN KEY (WEB_CRAWLING_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID)
);
/**********************************/
/* Table Name: Crawling Session Info */
/**********************************/
CREATE TABLE CRAWLING_SESSION_INFO(
ID IDENTITY NOT NULL PRIMARY KEY,
CRAWLING_SESSION_ID BIGINT NOT NULL,
KEY VARCHAR(20) NOT NULL,
VALUE VARCHAR(100) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
FOREIGN KEY (CRAWLING_SESSION_ID) REFERENCES CRAWLING_SESSION (ID)
);
/**********************************/
/* Table Name: Label Type */
/**********************************/
CREATE TABLE LABEL_TYPE(
ID IDENTITY NOT NULL PRIMARY KEY,
NAME VARCHAR(100) NOT NULL,
VALUE VARCHAR(20) NOT NULL,
INCLUDED_PATHS VARCHAR(4000),
EXCLUDED_PATHS VARCHAR(4000),
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: File To Label Mapping */
/**********************************/
CREATE TABLE FILE_CONFIG_TO_LABEL_TYPE_MAPPING(
ID IDENTITY NOT NULL PRIMARY KEY,
FILE_CONFIG_ID BIGINT NOT NULL,
LABEL_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID),
FOREIGN KEY (FILE_CONFIG_ID) REFERENCES FILE_CRAWLING_CONFIG (ID)
);
/**********************************/
/* Table Name: Web To Label Mapping */
/**********************************/
CREATE TABLE WEB_CONFIG_TO_LABEL_TYPE_MAPPING(
ID IDENTITY NOT NULL PRIMARY KEY,
WEB_CONFIG_ID BIGINT NOT NULL,
LABEL_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID),
FOREIGN KEY (WEB_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID)
);
/**********************************/
/* Table Name: Role Type */
/**********************************/
CREATE TABLE ROLE_TYPE(
ID IDENTITY NOT NULL PRIMARY KEY,
NAME VARCHAR(100) NOT NULL,
VALUE VARCHAR(20) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: File To Role Mapping */
/**********************************/
CREATE TABLE FILE_CONFIG_TO_ROLE_TYPE_MAPPING(
ID IDENTITY NOT NULL PRIMARY KEY,
FILE_CONFIG_ID BIGINT NOT NULL,
ROLE_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (FILE_CONFIG_ID) REFERENCES FILE_CRAWLING_CONFIG (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
/**********************************/
/* Table Name: Web To Role Mapping */
/**********************************/
CREATE TABLE WEB_CONFIG_TO_ROLE_TYPE_MAPPING(
ID IDENTITY NOT NULL PRIMARY KEY,
WEB_CONFIG_ID BIGINT NOT NULL,
ROLE_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (WEB_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
/**********************************/
/* Table Name: Data Crawling Config */
/**********************************/
CREATE TABLE DATA_CRAWLING_CONFIG(
ID IDENTITY NOT NULL PRIMARY KEY,
NAME VARCHAR(200) NOT NULL,
HANDLER_NAME VARCHAR(200) NOT NULL,
HANDLER_PARAMETER VARCHAR(4000),
HANDLER_SCRIPT VARCHAR(4000),
BOOST FLOAT NOT NULL,
AVAILABLE VARCHAR(1) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Data To Role Mapping */
/**********************************/
CREATE TABLE DATA_CONFIG_TO_ROLE_TYPE_MAPPING(
ID IDENTITY NOT NULL PRIMARY KEY,
DATA_CONFIG_ID BIGINT NOT NULL,
ROLE_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (DATA_CONFIG_ID) REFERENCES DATA_CRAWLING_CONFIG (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
/**********************************/
/* Table Name: Data To Label Mapping */
/**********************************/
CREATE TABLE DATA_CONFIG_TO_LABEL_TYPE_MAPPING(
ID IDENTITY NOT NULL PRIMARY KEY,
DATA_CONFIG_ID BIGINT NOT NULL,
LABEL_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (DATA_CONFIG_ID) REFERENCES DATA_CRAWLING_CONFIG (ID),
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID)
);
/**********************************/
/* Table Name: User Info */
/**********************************/
CREATE TABLE USER_INFO(
ID IDENTITY NOT NULL PRIMARY KEY,
CODE VARCHAR(1000) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_TIME TIMESTAMP NOT NULL
);
/**********************************/
/* Table Name: Search Log */
/**********************************/
CREATE TABLE SEARCH_LOG(
ID IDENTITY NOT NULL PRIMARY KEY,
SEARCH_WORD VARCHAR(1000),
REQUESTED_TIME TIMESTAMP NOT NULL,
RESPONSE_TIME INTEGER NOT NULL,
HIT_COUNT BIGINT NOT NULL,
QUERY_OFFSET INTEGER NOT NULL,
QUERY_PAGE_SIZE INTEGER NOT NULL,
USER_AGENT VARCHAR(255),
REFERER VARCHAR(1000),
CLIENT_IP VARCHAR(50),
USER_SESSION_ID VARCHAR(100),
ACCESS_TYPE VARCHAR(1) NOT NULL,
USER_ID BIGINT,
FOREIGN KEY (USER_ID) REFERENCES USER_INFO (ID)
);
/**********************************/
/* Table Name: Label Type To Role Type Mapping */
/**********************************/
CREATE TABLE LABEL_TYPE_TO_ROLE_TYPE_MAPPING(
ID IDENTITY NOT NULL PRIMARY KEY,
LABEL_TYPE_ID BIGINT NOT NULL,
ROLE_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
/**********************************/
/* Table Name: Click Log */
/**********************************/
CREATE TABLE CLICK_LOG(
ID IDENTITY NOT NULL PRIMARY KEY,
SEARCH_ID BIGINT NOT NULL,
URL VARCHAR(4000) NOT NULL,
REQUESTED_TIME TIMESTAMP NOT NULL,
FOREIGN KEY (SEARCH_ID) REFERENCES SEARCH_LOG (ID)
);
/**********************************/
/* Table Name: Failure Url */
/**********************************/
CREATE TABLE FAILURE_URL(
ID IDENTITY NOT NULL PRIMARY KEY,
URL VARCHAR(4000) NOT NULL,
THREAD_NAME VARCHAR(30) NOT NULL,
ERROR_NAME VARCHAR(255),
ERROR_LOG VARCHAR(4000),
ERROR_COUNT INTEGER NOT NULL,
LAST_ACCESS_TIME TIMESTAMP NOT NULL,
CONFIG_ID VARCHAR(100)
);
/**********************************/
/* Table Name: File Authentication */
/**********************************/
CREATE TABLE FILE_AUTHENTICATION(
ID IDENTITY NOT NULL PRIMARY KEY,
HOSTNAME VARCHAR(255),
PORT INTEGER NOT NULL,
PROTOCOL_SCHEME VARCHAR(10),
USERNAME VARCHAR(100) NOT NULL,
PASSWORD VARCHAR(100),
PARAMETERS VARCHAR(1000),
FILE_CRAWLING_CONFIG_ID BIGINT NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL,
FOREIGN KEY (FILE_CRAWLING_CONFIG_ID) REFERENCES FILE_CRAWLING_CONFIG (ID)
);
/**********************************/
/* Table Name: Search Field Log */
/**********************************/
CREATE TABLE SEARCH_FIELD_LOG(
ID IDENTITY NOT NULL PRIMARY KEY,
SEARCH_ID BIGINT NOT NULL,
NAME VARCHAR(255) NOT NULL,
VALUE VARCHAR(1000) NOT NULL,
FOREIGN KEY (SEARCH_ID) REFERENCES SEARCH_LOG (ID)
);
/**********************************/
/* Table Name: Favorite Log */
/**********************************/
CREATE TABLE FAVORITE_LOG(
ID IDENTITY NOT NULL PRIMARY KEY,
USER_ID BIGINT NOT NULL,
URL VARCHAR(4000) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
FOREIGN KEY (USER_ID) REFERENCES USER_INFO (ID)
);
/**********************************/
/* Table Name: Suggest Bad Word */
/**********************************/
CREATE TABLE SUGGEST_BAD_WORD(
ID IDENTITY NOT NULL PRIMARY KEY,
SUGGEST_WORD VARCHAR(255) NOT NULL,
TARGET_ROLE VARCHAR(255),
TARGET_LABEL VARCHAR(255),
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Suggest Elevate Word */
/**********************************/
CREATE TABLE SUGGEST_ELEVATE_WORD(
ID IDENTITY NOT NULL PRIMARY KEY,
SUGGEST_WORD VARCHAR(255) NOT NULL,
READING VARCHAR(255),
TARGET_ROLE VARCHAR(255),
TARGET_LABEL VARCHAR(255),
BOOST FLOAT NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
CREATE UNIQUE INDEX UQ_FAVORITE_LOG ON FAVORITE_LOG (USER_ID, URL);
CREATE INDEX IDX_OVERLAPPING_HOST_BY_REGULAR_NAME_AND_SORT_ORDER ON OVERLAPPING_HOST (REGULAR_NAME, SORT_ORDER);
CREATE INDEX IDX_FILE_CONFIG_TO_LABEL_TYPE_MAPPING_FOR_FILE_CONFIG ON FILE_CONFIG_TO_LABEL_TYPE_MAPPING (FILE_CONFIG_ID);
CREATE INDEX IDX_WEB_CONFIG_TO_LABEL_TYPE_MAPPING_FOR_WEB_CONFIG ON WEB_CONFIG_TO_LABEL_TYPE_MAPPING (WEB_CONFIG_ID);
CREATE INDEX IDX_FILE_CONFIG_TO_ROLE_TYPE_MAPPING_FOR_FILE_CONFIG ON FILE_CONFIG_TO_ROLE_TYPE_MAPPING (FILE_CONFIG_ID);
CREATE INDEX IDX_WEB_CONFIG_TO_ROLE_TYPE_MAPPING_FOR_WEB_CONFIG ON WEB_CONFIG_TO_ROLE_TYPE_MAPPING (WEB_CONFIG_ID);
CREATE INDEX IDX_DATA_CONFIG_TO_ROLE_TYPE_MAPPING_FOR_DATA_CONFIG ON DATA_CONFIG_TO_ROLE_TYPE_MAPPING (DATA_CONFIG_ID);
CREATE INDEX IDX_DATA_CONFIG_TO_LABEL_TYPE_MAPPING_FOR_DATA_CONFIG ON DATA_CONFIG_TO_LABEL_TYPE_MAPPING (DATA_CONFIG_ID);
CREATE INDEX IDX_SEARCH_LOG_BY_HIT_COUNT ON SEARCH_LOG (HIT_COUNT);
CREATE INDEX IDX_SEARCH_LOG_BY_RESPONSE_TIME ON SEARCH_LOG (RESPONSE_TIME);
CREATE INDEX IDX_SEARCH_LOG_BY_REQUESTED_TIME ON SEARCH_LOG (REQUESTED_TIME);
CREATE INDEX IDX_SEARCH_LOG_BY_SEARCH_WORD ON SEARCH_LOG (SEARCH_WORD);
CREATE INDEX IDX_SEARCH_LOG_BY_RTIME_USID ON SEARCH_LOG (REQUESTED_TIME, USER_SESSION_ID);
CREATE INDEX IDX_SEARCH_LOG_BY_USER_ID ON SEARCH_LOG (USER_ID);
CREATE INDEX IDX_CLICK_LOG_URL ON CLICK_LOG (URL);
CREATE INDEX IDX_FAILURE_URL_FOR_LIST ON FAILURE_URL (URL, LAST_ACCESS_TIME, ERROR_NAME, ERROR_COUNT);
CREATE INDEX IDX_FAILURE_URL_BY_CONFIG_ID ON FAILURE_URL (CONFIG_ID);
CREATE INDEX IDX_SEARCH_FIELD_LOG_NAME ON SEARCH_FIELD_LOG (NAME);
CREATE INDEX IDX_SESSION_NAME_EXPIRED ON CRAWLING_SESSION (NAME, EXPIRED_TIME);
INSERT INTO PUBLIC.SCHEDULED_JOB(ID, NAME, TARGET, CRON_EXPRESSION, SCRIPT_TYPE, SCRIPT_DATA, CRAWLER, JOB_LOGGING, AVAILABLE, SORT_ORDER, CREATED_BY, CREATED_TIME, UPDATED_BY, UPDATED_TIME, DELETED_BY, DELETED_TIME, VERSION_NO) VALUES
(1, 'Crawler', 'all', '0 0 0 * * ?', 'groovy', 'return container.getComponent("crawlJob").execute(executor);', 'T', 'T', 'T', 0, 'system', TIMESTAMP '2013-01-01 00:00:00.000', 'system', TIMESTAMP '2013-01-01 00:00:00.000', NULL, NULL, 0),
(2, 'Minutely Tasks', 'all', '0 * * * * ?', 'groovy', 'return container.getComponent("aggregateLogJob").execute();', 'F', 'F', 'T', 10, 'system', TIMESTAMP '2013-01-01 00:00:00.000', 'system', TIMESTAMP '2013-01-01 00:00:00.000', NULL, NULL, 0),
(3, 'Hourly Tasks', 'all', '0 0 * * * ?', 'groovy', 'return container.getComponent("updateStatsJob").execute()+container.getComponent("updateHotWordJob").execute();', 'F', 'F', 'T', 20, 'system', TIMESTAMP '2013-01-01 00:00:00.000', 'system', TIMESTAMP '2013-01-01 00:00:00.000', NULL, NULL, 0),
(4, 'Daily Tasks', 'all', '0 0 0 * * ?', 'groovy', 'return container.getComponent("purgeLogJob").execute();', 'F', 'F', 'T', 30, 'system', TIMESTAMP '2013-01-01 00:00:00.000', 'system', TIMESTAMP '2013-01-01 00:00:00.000', NULL, NULL, 0);
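As an illustration of how the mapping tables above are meant to be used, the following hypothetical lookup (not part of the schema script; the 'T' flag value is an assumption based on the seed data) joins a web crawling config to its assigned labels:

```sql
-- Hypothetical example query, not part of the schema script: list the
-- labels attached to each available, non-deleted web crawling config.
SELECT w.NAME  AS CONFIG_NAME,
       l.NAME  AS LABEL_NAME,
       l.VALUE AS LABEL_VALUE
FROM WEB_CRAWLING_CONFIG w
JOIN WEB_CONFIG_TO_LABEL_TYPE_MAPPING m ON m.WEB_CONFIG_ID = w.ID
JOIN LABEL_TYPE l ON l.ID = m.LABEL_TYPE_ID
WHERE w.AVAILABLE = 'T'
  AND w.DELETED_TIME IS NULL
ORDER BY w.SORT_ORDER, l.SORT_ORDER;
```

The role-type mapping tables (`WEB_CONFIG_TO_ROLE_TYPE_MAPPING` and friends) follow the same join pattern.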


@ -1,571 +0,0 @@
DROP TABLE IF EXISTS FAVORITE_LOG;
DROP TABLE IF EXISTS SEARCH_FIELD_LOG;
DROP TABLE IF EXISTS FILE_AUTHENTICATION;
DROP TABLE IF EXISTS FAILURE_URL;
DROP TABLE IF EXISTS CLICK_LOG;
DROP TABLE IF EXISTS LABEL_TYPE_TO_ROLE_TYPE_MAPPING;
DROP TABLE IF EXISTS SEARCH_LOG;
DROP TABLE IF EXISTS USER_INFO;
DROP TABLE IF EXISTS DATA_CONFIG_TO_BROWSER_TYPE_MAPPING;
DROP TABLE IF EXISTS DATA_CONFIG_TO_LABEL_TYPE_MAPPING;
DROP TABLE IF EXISTS DATA_CONFIG_TO_ROLE_TYPE_MAPPING;
DROP TABLE IF EXISTS DATA_CRAWLING_CONFIG;
DROP TABLE IF EXISTS WEB_CONFIG_TO_ROLE_TYPE_MAPPING;
DROP TABLE IF EXISTS FILE_CONFIG_TO_ROLE_TYPE_MAPPING;
DROP TABLE IF EXISTS ROLE_TYPE;
DROP TABLE IF EXISTS WEB_CONFIG_TO_LABEL_TYPE_MAPPING;
DROP TABLE IF EXISTS FILE_CONFIG_TO_LABEL_TYPE_MAPPING;
DROP TABLE IF EXISTS LABEL_TYPE;
DROP TABLE IF EXISTS CRAWLING_SESSION_INFO;
DROP TABLE IF EXISTS WEB_AUTHENTICATION;
DROP TABLE IF EXISTS KEY_MATCH;
DROP TABLE IF EXISTS BOOST_DOCUMENT_RULE;
DROP TABLE IF EXISTS REQUEST_HEADER;
DROP TABLE IF EXISTS OVERLAPPING_HOST;
DROP TABLE IF EXISTS CRAWLING_SESSION;
DROP TABLE IF EXISTS PATH_MAPPING;
DROP TABLE IF EXISTS JOB_LOG;
DROP TABLE IF EXISTS SCHEDULED_JOB;
DROP TABLE IF EXISTS FILE_CONFIG_TO_BROWSER_TYPE_MAPPING;
DROP TABLE IF EXISTS WEB_CONFIG_TO_BROWSER_TYPE_MAPPING;
DROP TABLE IF EXISTS FILE_CRAWLING_CONFIG;
DROP TABLE IF EXISTS BROWSER_TYPE;
DROP TABLE IF EXISTS WEB_CRAWLING_CONFIG;
DROP TABLE IF EXISTS SUGGEST_BAD_WORD;
DROP TABLE IF EXISTS SUGGEST_ELEVATE_WORD;
/**********************************/
/* Table Name: Web Crawling Config */
/**********************************/
CREATE TABLE WEB_CRAWLING_CONFIG(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
NAME VARCHAR(200) NOT NULL,
URLS TEXT NOT NULL,
INCLUDED_URLS TEXT,
EXCLUDED_URLS TEXT,
INCLUDED_DOC_URLS TEXT,
EXCLUDED_DOC_URLS TEXT,
CONFIG_PARAMETER TEXT,
DEPTH INTEGER,
MAX_ACCESS_COUNT BIGINT,
USER_AGENT VARCHAR(200) NOT NULL,
NUM_OF_THREAD INTEGER NOT NULL,
INTERVAL_TIME INTEGER NOT NULL,
BOOST FLOAT NOT NULL,
AVAILABLE VARCHAR(1) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: File Crawling Config */
/**********************************/
CREATE TABLE FILE_CRAWLING_CONFIG(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
NAME VARCHAR(200) NOT NULL,
PATHS TEXT NOT NULL,
INCLUDED_PATHS TEXT,
EXCLUDED_PATHS TEXT,
INCLUDED_DOC_PATHS TEXT,
EXCLUDED_DOC_PATHS TEXT,
CONFIG_PARAMETER TEXT,
DEPTH INTEGER,
MAX_ACCESS_COUNT BIGINT,
NUM_OF_THREAD INTEGER NOT NULL,
INTERVAL_TIME INTEGER NOT NULL,
BOOST FLOAT NOT NULL,
AVAILABLE VARCHAR(1) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Scheduled Job */
/**********************************/
CREATE TABLE SCHEDULED_JOB(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
NAME VARCHAR(100) NOT NULL,
TARGET VARCHAR(100) NOT NULL,
CRON_EXPRESSION VARCHAR(100) NOT NULL,
SCRIPT_TYPE VARCHAR(100) NOT NULL,
SCRIPT_DATA TEXT,
CRAWLER VARCHAR(1) NOT NULL,
JOB_LOGGING VARCHAR(1) NOT NULL,
AVAILABLE VARCHAR(1) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Job Log */
/**********************************/
CREATE TABLE JOB_LOG(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
JOB_NAME VARCHAR(100) NOT NULL,
JOB_STATUS VARCHAR(10) NOT NULL,
TARGET VARCHAR(100) NOT NULL,
SCRIPT_TYPE VARCHAR(100) NOT NULL,
SCRIPT_DATA TEXT,
SCRIPT_RESULT TEXT,
START_TIME TIMESTAMP NOT NULL,
END_TIME TIMESTAMP
);
/**********************************/
/* Table Name: Path Mapping */
/**********************************/
CREATE TABLE PATH_MAPPING(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
REGEX TEXT NOT NULL,
REPLACEMENT TEXT NOT NULL,
PROCESS_TYPE VARCHAR(1) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Crawling Session */
/**********************************/
CREATE TABLE CRAWLING_SESSION(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
SESSION_ID VARCHAR(20) NOT NULL,
NAME VARCHAR(20),
EXPIRED_TIME TIMESTAMP,
CREATED_TIME TIMESTAMP NOT NULL
);
/**********************************/
/* Table Name: Overlapping Host */
/**********************************/
CREATE TABLE OVERLAPPING_HOST(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
REGULAR_NAME TEXT NOT NULL,
OVERLAPPING_NAME TEXT NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Request Header */
/**********************************/
CREATE TABLE REQUEST_HEADER(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
NAME VARCHAR(100) NOT NULL,
VALUE TEXT NOT NULL,
WEB_CRAWLING_CONFIG_ID BIGINT NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL,
FOREIGN KEY (WEB_CRAWLING_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID)
);
/**********************************/
/* Table Name: Boost Document Rule */
/**********************************/
CREATE TABLE BOOST_DOCUMENT_RULE(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
URL_EXPR TEXT NOT NULL,
BOOST_EXPR TEXT NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Key Match */
/**********************************/
CREATE TABLE KEY_MATCH(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
TERM VARCHAR(100) NOT NULL,
QUERY TEXT NOT NULL,
MAX_SIZE INTEGER NOT NULL,
BOOST FLOAT NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Web Authentication */
/**********************************/
CREATE TABLE WEB_AUTHENTICATION(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
HOSTNAME VARCHAR(100),
PORT INTEGER NOT NULL,
AUTH_REALM VARCHAR(100),
PROTOCOL_SCHEME VARCHAR(10),
USERNAME VARCHAR(100) NOT NULL,
PASSWORD VARCHAR(100),
PARAMETERS TEXT,
WEB_CRAWLING_CONFIG_ID BIGINT NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL,
FOREIGN KEY (WEB_CRAWLING_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID)
);
/**********************************/
/* Table Name: Crawling Session Info */
/**********************************/
CREATE TABLE CRAWLING_SESSION_INFO(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
CRAWLING_SESSION_ID BIGINT NOT NULL,
ID_KEY VARCHAR(20) NOT NULL,
VALUE VARCHAR(100) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
FOREIGN KEY (CRAWLING_SESSION_ID) REFERENCES CRAWLING_SESSION (ID)
);
/**********************************/
/* Table Name: Label Type */
/**********************************/
CREATE TABLE LABEL_TYPE(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
NAME VARCHAR(100) NOT NULL,
VALUE VARCHAR(20) NOT NULL,
INCLUDED_PATHS TEXT,
EXCLUDED_PATHS TEXT,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: File To Label Mapping */
/**********************************/
CREATE TABLE FILE_CONFIG_TO_LABEL_TYPE_MAPPING(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
FILE_CONFIG_ID BIGINT NOT NULL,
LABEL_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID),
FOREIGN KEY (FILE_CONFIG_ID) REFERENCES FILE_CRAWLING_CONFIG (ID)
);
/**********************************/
/* Table Name: Web To Label Mapping */
/**********************************/
CREATE TABLE WEB_CONFIG_TO_LABEL_TYPE_MAPPING(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
WEB_CONFIG_ID BIGINT NOT NULL,
LABEL_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID),
FOREIGN KEY (WEB_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID)
);
/**********************************/
/* Table Name: Role Type */
/**********************************/
CREATE TABLE ROLE_TYPE(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
NAME VARCHAR(100) NOT NULL,
VALUE VARCHAR(20) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: File To Role Mapping */
/**********************************/
CREATE TABLE FILE_CONFIG_TO_ROLE_TYPE_MAPPING(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
FILE_CONFIG_ID BIGINT NOT NULL,
ROLE_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (FILE_CONFIG_ID) REFERENCES FILE_CRAWLING_CONFIG (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
/**********************************/
/* Table Name: Web To Role Mapping */
/**********************************/
CREATE TABLE WEB_CONFIG_TO_ROLE_TYPE_MAPPING(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
WEB_CONFIG_ID BIGINT NOT NULL,
ROLE_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (WEB_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
/**********************************/
/* Table Name: Data Crawling Config */
/**********************************/
CREATE TABLE DATA_CRAWLING_CONFIG(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
NAME VARCHAR(200) NOT NULL,
HANDLER_NAME VARCHAR(200) NOT NULL,
HANDLER_PARAMETER TEXT,
HANDLER_SCRIPT TEXT,
BOOST FLOAT NOT NULL,
AVAILABLE VARCHAR(1) NOT NULL,
SORT_ORDER INTEGER NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Data To Role Mapping */
/**********************************/
CREATE TABLE DATA_CONFIG_TO_ROLE_TYPE_MAPPING(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
DATA_CONFIG_ID BIGINT NOT NULL,
ROLE_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (DATA_CONFIG_ID) REFERENCES DATA_CRAWLING_CONFIG (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
/**********************************/
/* Table Name: Data To Label Mapping */
/**********************************/
CREATE TABLE DATA_CONFIG_TO_LABEL_TYPE_MAPPING(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
DATA_CONFIG_ID BIGINT NOT NULL,
LABEL_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (DATA_CONFIG_ID) REFERENCES DATA_CRAWLING_CONFIG (ID),
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID)
);
/**********************************/
/* Table Name: User Info */
/**********************************/
CREATE TABLE USER_INFO(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
CODE TEXT NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_TIME TIMESTAMP NOT NULL
);
/**********************************/
/* Table Name: Search Log */
/**********************************/
CREATE TABLE SEARCH_LOG(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
SEARCH_WORD TEXT,
REQUESTED_TIME TIMESTAMP NOT NULL,
RESPONSE_TIME INTEGER NOT NULL,
HIT_COUNT BIGINT NOT NULL,
QUERY_OFFSET INTEGER NOT NULL,
QUERY_PAGE_SIZE INTEGER NOT NULL,
USER_AGENT VARCHAR(255),
REFERER TEXT,
CLIENT_IP VARCHAR(50),
USER_SESSION_ID VARCHAR(100),
ACCESS_TYPE VARCHAR(1) NOT NULL,
USER_ID BIGINT,
FOREIGN KEY (USER_ID) REFERENCES USER_INFO (ID)
);
/**********************************/
/* Table Name: Label Type To Role Type Mapping */
/**********************************/
CREATE TABLE LABEL_TYPE_TO_ROLE_TYPE_MAPPING(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
LABEL_TYPE_ID BIGINT NOT NULL,
ROLE_TYPE_ID BIGINT NOT NULL,
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
/**********************************/
/* Table Name: Click Log */
/**********************************/
CREATE TABLE CLICK_LOG(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
SEARCH_ID BIGINT NOT NULL,
URL TEXT NOT NULL,
REQUESTED_TIME TIMESTAMP NOT NULL,
FOREIGN KEY (SEARCH_ID) REFERENCES SEARCH_LOG (ID)
);
/**********************************/
/* Table Name: Failure Url */
/**********************************/
CREATE TABLE FAILURE_URL(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
URL TEXT NOT NULL,
THREAD_NAME VARCHAR(30) NOT NULL,
ERROR_NAME VARCHAR(255),
ERROR_LOG TEXT,
ERROR_COUNT INTEGER NOT NULL,
LAST_ACCESS_TIME TIMESTAMP NOT NULL,
CONFIG_ID VARCHAR(100)
);
/**********************************/
/* Table Name: File Authentication */
/**********************************/
CREATE TABLE FILE_AUTHENTICATION(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
HOSTNAME VARCHAR(255),
PORT INTEGER NOT NULL,
PROTOCOL_SCHEME VARCHAR(10),
USERNAME VARCHAR(100) NOT NULL,
PASSWORD VARCHAR(100),
PARAMETERS TEXT,
FILE_CRAWLING_CONFIG_ID BIGINT NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL,
FOREIGN KEY (FILE_CRAWLING_CONFIG_ID) REFERENCES FILE_CRAWLING_CONFIG (ID)
);
/**********************************/
/* Table Name: Search Field Log */
/**********************************/
CREATE TABLE SEARCH_FIELD_LOG(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
SEARCH_ID BIGINT NOT NULL,
NAME VARCHAR(255) NOT NULL,
VALUE TEXT NOT NULL,
FOREIGN KEY (SEARCH_ID) REFERENCES SEARCH_LOG (ID)
);
/**********************************/
/* Table Name: Favorite Log */
/**********************************/
CREATE TABLE FAVORITE_LOG(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
USER_ID BIGINT NOT NULL,
URL TEXT NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
FOREIGN KEY (USER_ID) REFERENCES USER_INFO (ID)
);
/**********************************/
/* Table Name: Suggest Ng Word */
/**********************************/
CREATE TABLE SUGGEST_BAD_WORD(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
SUGGEST_WORD VARCHAR(255) NOT NULL,
TARGET_ROLE VARCHAR(255),
TARGET_LABEL VARCHAR(255),
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
/**********************************/
/* Table Name: Suggest Elevate word */
/**********************************/
CREATE TABLE SUGGEST_ELEVATE_WORD(
ID BIGINT NOT NULL PRIMARY KEY AUTO_INCREMENT,
SUGGEST_WORD VARCHAR(255) NOT NULL,
READING VARCHAR(255),
TARGET_ROLE VARCHAR(255),
TARGET_LABEL VARCHAR(255),
BOOST FLOAT NOT NULL,
CREATED_BY VARCHAR(255) NOT NULL,
CREATED_TIME TIMESTAMP NOT NULL,
UPDATED_BY VARCHAR(255),
UPDATED_TIME TIMESTAMP,
DELETED_BY VARCHAR(255),
DELETED_TIME TIMESTAMP,
VERSION_NO INTEGER NOT NULL
);
CREATE UNIQUE INDEX UQ_FAVORITE_LOG ON FAVORITE_LOG (USER_ID, URL(200));
CREATE INDEX IDX_OVERLAPPING_HOST_BY_REGULAR_NAME_AND_SORT_ORDER ON OVERLAPPING_HOST (REGULAR_NAME(200), SORT_ORDER);
CREATE INDEX IDX_FILE_CONFIG_TO_LABEL_TYPE_MAPPING_FOR_FILE_CONFIG ON FILE_CONFIG_TO_LABEL_TYPE_MAPPING (FILE_CONFIG_ID);
CREATE INDEX IDX_WEB_CONFIG_TO_LABEL_TYPE_MAPPING__FOR_WEB_CONFIG ON WEB_CONFIG_TO_LABEL_TYPE_MAPPING (WEB_CONFIG_ID);
CREATE INDEX IDX_FILE_CONFIG_TO_ROLE_TYPE_MAPPING_FOR_FILE_CONFIG ON FILE_CONFIG_TO_ROLE_TYPE_MAPPING (FILE_CONFIG_ID);
CREATE INDEX IDX_WEB_CONFIG_TO_ROLE_TYPE_MAPPING_FOR_WEB_CONFIG ON WEB_CONFIG_TO_ROLE_TYPE_MAPPING (WEB_CONFIG_ID);
CREATE INDEX IDX_DATA_CONFIG_TO_ROLE_TYPE_MAPPING_FOR_DATA_CONFIG ON DATA_CONFIG_TO_ROLE_TYPE_MAPPING (DATA_CONFIG_ID);
CREATE INDEX IDX_DATA_CONFIG_TO_LABEL_TYPE_MAPPING_FOR_DATA_CONFIG ON DATA_CONFIG_TO_LABEL_TYPE_MAPPING (DATA_CONFIG_ID);
CREATE INDEX IDX_SEARCH_LOG_BY_HIT_COUNT ON SEARCH_LOG (HIT_COUNT);
CREATE INDEX IDX_SEARCH_LOG_BY_RESPONSE_TIME ON SEARCH_LOG (RESPONSE_TIME);
CREATE INDEX IDX_SEARCH_LOG_BY_REQUESTED_TIME ON SEARCH_LOG (REQUESTED_TIME);
CREATE INDEX IDX_SEARCH_LOG_BY_SEARCH_WORD ON SEARCH_LOG (SEARCH_WORD(255));
CREATE INDEX IDX_SEARCH_LOG_BY_RTIME_USID ON SEARCH_LOG (REQUESTED_TIME, USER_SESSION_ID);
CREATE INDEX IDX_SEARCH_LOG_BY_USER_ID ON SEARCH_LOG (USER_ID);
CREATE INDEX IDX_CLICK_LOG_URL ON CLICK_LOG (URL(255));
CREATE INDEX IDX_FAILURE_URL_FOR_LIST ON FAILURE_URL (URL(200), LAST_ACCESS_TIME, ERROR_NAME(100), ERROR_COUNT);
CREATE INDEX IDX_FAILURE_URL_BY_CONFIG_ID ON FAILURE_URL (CONFIG_ID);
CREATE INDEX IDX_SEARCH_FIELD_LOG_NAME ON SEARCH_FIELD_LOG (NAME);
CREATE INDEX IDX_SESSION_NAME_EXPIRED ON CRAWLING_SESSION (NAME, EXPIRED_TIME);
INSERT INTO SCHEDULED_JOB (ID, NAME, TARGET, CRON_EXPRESSION, SCRIPT_TYPE, SCRIPT_DATA, CRAWLER, JOB_LOGGING, AVAILABLE, SORT_ORDER, CREATED_BY, CREATED_TIME, UPDATED_BY, UPDATED_TIME, VERSION_NO) VALUES (1, 'Crawler', 'all', '0 0 0 * * ?', 'groovy', 'return container.getComponent("crawlJob").execute(executor);', 'T', 'T', 'T', 0, 'system', '2000-01-01 00:00:00', 'system', '2000-01-01 00:00:00', 0);
INSERT INTO SCHEDULED_JOB (ID, NAME, TARGET, CRON_EXPRESSION, SCRIPT_TYPE, SCRIPT_DATA, CRAWLER, JOB_LOGGING, AVAILABLE, SORT_ORDER, CREATED_BY, CREATED_TIME, UPDATED_BY, UPDATED_TIME, VERSION_NO) VALUES (2, 'Minutely Tasks', 'all', '0 * * * * ?', 'groovy', 'return container.getComponent("aggregateLogJob").execute();', 'F', 'F', 'T', 10, 'system', '2000-01-01 00:00:00', 'system', '2000-01-01 00:00:00', 0);
INSERT INTO SCHEDULED_JOB (ID, NAME, TARGET, CRON_EXPRESSION, SCRIPT_TYPE, SCRIPT_DATA, CRAWLER, JOB_LOGGING, AVAILABLE, SORT_ORDER, CREATED_BY, CREATED_TIME, UPDATED_BY, UPDATED_TIME, VERSION_NO) VALUES (3, 'Hourly Tasks', 'all', '0 0 * * * ?', 'groovy', 'return container.getComponent("updateStatsJob").execute()+container.getComponent("updateHotWordJob").execute();', 'F', 'F', 'T', 20, 'system', '2000-01-01 00:00:00', 'system', '2000-01-01 00:00:00', 0);
INSERT INTO SCHEDULED_JOB (ID, NAME, TARGET, CRON_EXPRESSION, SCRIPT_TYPE, SCRIPT_DATA, CRAWLER, JOB_LOGGING, AVAILABLE, SORT_ORDER, CREATED_BY, CREATED_TIME, UPDATED_BY, UPDATED_TIME, VERSION_NO) VALUES (4, 'Daily Tasks', 'all', '0 0 0 * * ?', 'groovy', 'return container.getComponent("purgeLogJob").execute();', 'F', 'F', 'T', 30, 'system', '2000-01-01 00:00:00', 'system', '2000-01-01 00:00:00', 0);
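The seed rows above store Quartz-style six-field cron expressions (seconds, minutes, hours, day-of-month, month, day-of-week) in CRON_EXPRESSION. A minimal sketch of how those fields break down — the helper and field labels are illustrative, not part of the schema or of Fess itself:

```python
# Hypothetical helper: label the six fields of a Quartz-style cron expression.
def parse_quartz_cron(expr):
    names = ["second", "minute", "hour", "day_of_month", "month", "day_of_week"]
    fields = expr.split()
    if len(fields) != len(names):
        raise ValueError("expected 6 fields, got %d" % len(fields))
    return dict(zip(names, fields))

# '0 0 0 * * ?' (the Crawler and Daily Tasks jobs above) fires at midnight every day;
# '0 * * * * ?' (Minutely Tasks) fires at second 0 of every minute.
print(parse_quartz_cron("0 0 0 * * ?"))
```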


@@ -1,16 +0,0 @@
<project basedir=".">
<property name="sql.driver" value="oracle.jdbc.driver.OracleDriver"/>
<property name="sql.url" value="jdbc:oracle:thin:@localhost:1521:XE"/>
<property name="sql.user" value="fess"/>
<property name="sql.pass" value="fess"/>
<target name="create">
<sql driver="${sql.driver}" url="${sql.url}" userid="${sql.user}" password="${sql.pass}" onerror="continue">
<classpath>
<pathelement location="${oracle.jar.file}"/>
</classpath>
<transaction src="fess.ddl"/>
</sql>
</target>
</project>


@@ -1,606 +0,0 @@
DROP SEQUENCE WEB_CRAWLING_CONFIG_SEQ;
DROP SEQUENCE FILE_CRAWLING_CONFIG_SEQ;
DROP SEQUENCE JOB_LOG_SEQ;
DROP SEQUENCE SCHEDULED_JOB_SEQ;
DROP SEQUENCE PATH_MAPPING_SEQ;
DROP SEQUENCE CRAWLING_SESSION_SEQ;
DROP SEQUENCE OVERLAPPING_HOST_SEQ;
DROP SEQUENCE REQUEST_HEADER_SEQ;
DROP SEQUENCE KEY_MATCH_SEQ;
DROP SEQUENCE BOOST_DOCUMENT_RULE_SEQ;
DROP SEQUENCE WEB_AUTHENTICATION_SEQ;
DROP SEQUENCE CRAWLING_SESSION_INFO_SEQ;
DROP SEQUENCE LABEL_TYPE_SEQ;
DROP SEQUENCE FILE_CONFIG_TO_LABEL_TYPE_SEQ;
DROP SEQUENCE WEB_CONFIG_TO_LABEL_TYPE_SEQ;
DROP SEQUENCE ROLE_TYPE_SEQ;
DROP SEQUENCE FILE_CONFIG_TO_ROLE_TYPE_SEQ;
DROP SEQUENCE WEB_CONFIG_TO_ROLE_TYPE_SEQ;
DROP SEQUENCE DATA_CRAWLING_CONFIG_SEQ;
DROP SEQUENCE DATA_CONFIG_TO_ROLE_TYPE_SEQ;
DROP SEQUENCE DATA_CONFIG_TO_LABEL_TYPE_SEQ;
DROP SEQUENCE USER_INFO_SEQ;
DROP SEQUENCE SEARCH_LOG_SEQ;
DROP SEQUENCE LABEL_TYPE_TO_ROLE_TYPE_SEQ;
DROP SEQUENCE CLICK_LOG_SEQ;
DROP SEQUENCE FAILURE_URL_SEQ;
DROP SEQUENCE FILE_AUTHENTICATION_SEQ;
DROP SEQUENCE SEARCH_FIELD_LOG_SEQ;
DROP SEQUENCE FAVORITE_LOG_SEQ;
DROP SEQUENCE SUGGEST_BAD_WORD_SEQ;
DROP SEQUENCE SUGGEST_ELEVATE_WORD_SEQ;
DROP TABLE "FAVORITE_LOG";
DROP TABLE "SEARCH_FIELD_LOG";
DROP TABLE "FILE_AUTHENTICATION";
DROP TABLE "FAILURE_URL";
DROP TABLE "CLICK_LOG";
DROP TABLE "LABEL_TYPE_TO_ROLE_TYPE";
DROP TABLE "SEARCH_LOG";
DROP TABLE "USER_INFO";
DROP TABLE "DATA_CONFIG_TO_LABEL_TYPE";
DROP TABLE "DATA_CONFIG_TO_ROLE_TYPE";
DROP TABLE "DATA_CRAWLING_CONFIG";
DROP TABLE "WEB_CONFIG_TO_ROLE_TYPE";
DROP TABLE "FILE_CONFIG_TO_ROLE_TYPE";
DROP TABLE "ROLE_TYPE";
DROP TABLE "WEB_CONFIG_TO_LABEL_TYPE";
DROP TABLE "FILE_CONFIG_TO_LABEL_TYPE";
DROP TABLE "LABEL_TYPE";
DROP TABLE "CRAWLING_SESSION_INFO";
DROP TABLE "WEB_AUTHENTICATION";
DROP TABLE "KEY_MATCH";
DROP TABLE "BOOST_DOCUMENT_RULE";
DROP TABLE "REQUEST_HEADER";
DROP TABLE "OVERLAPPING_HOST";
DROP TABLE "CRAWLING_SESSION";
DROP TABLE "PATH_MAPPING";
DROP TABLE "SCHEDULED_JOB";
DROP TABLE "JOB_LOG";
DROP TABLE "FILE_CRAWLING_CONFIG";
DROP TABLE "WEB_CRAWLING_CONFIG";
DROP TABLE "SUGGEST_BAD_WORD";
DROP TABLE "SUGGEST_ELEVATE_WORD";
CREATE TABLE "WEB_CRAWLING_CONFIG"(
"ID" NUMBER(18,0) NOT NULL,
"NAME" VARCHAR2(200) NOT NULL,
"URLS" VARCHAR2(4000) NOT NULL,
"INCLUDED_URLS" VARCHAR2(4000),
"EXCLUDED_URLS" VARCHAR2(4000),
"INCLUDED_DOC_URLS" VARCHAR2(4000),
"EXCLUDED_DOC_URLS" VARCHAR2(4000),
"CONFIG_PARAMETER" VARCHAR2(4000),
"DEPTH" NUMBER(7,0),
"MAX_ACCESS_COUNT" NUMBER(18,0),
"USER_AGENT" VARCHAR2(200) NOT NULL,
"NUM_OF_THREAD" NUMBER(9,0) NOT NULL,
"INTERVAL_TIME" NUMBER(9,0) NOT NULL,
"BOOST" FLOAT NOT NULL,
"AVAILABLE" VARCHAR2(1) NOT NULL,
"SORT_ORDER" NUMBER(9,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "WEB_CRAWLING_CONFIG_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "FILE_CRAWLING_CONFIG"(
"ID" NUMBER(18,0) NOT NULL,
"NAME" VARCHAR2(200) NOT NULL,
"PATHS" VARCHAR2(4000) NOT NULL,
"INCLUDED_PATHS" VARCHAR2(4000),
"EXCLUDED_PATHS" VARCHAR2(4000),
"INCLUDED_DOC_PATHS" VARCHAR2(4000),
"EXCLUDED_DOC_PATHS" VARCHAR2(4000),
"CONFIG_PARAMETER" VARCHAR2(4000),
"DEPTH" NUMBER(9,0),
"MAX_ACCESS_COUNT" NUMBER(18,0),
"NUM_OF_THREAD" NUMBER(9,0) NOT NULL,
"INTERVAL_TIME" NUMBER(9,0) NOT NULL,
"BOOST" FLOAT NOT NULL,
"AVAILABLE" VARCHAR2(1) NOT NULL,
"SORT_ORDER" NUMBER(9,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "FILE_CRAWLING_CONFIG_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "SCHEDULED_JOB"(
"ID" NUMBER(18,0) NOT NULL,
"NAME" VARCHAR2(100) NOT NULL,
"TARGET" VARCHAR2(100) NOT NULL,
"CRON_EXPRESSION" VARCHAR2(100) NOT NULL,
"SCRIPT_TYPE" VARCHAR2(100) NOT NULL,
"SCRIPT_DATA" VARCHAR2(4000),
"CRAWLER" VARCHAR2(1) NOT NULL,
"JOB_LOGGING" VARCHAR2(1) NOT NULL,
"AVAILABLE" VARCHAR2(1) NOT NULL,
"SORT_ORDER" NUMBER(9,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "SCHEDULED_JOB_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "JOB_LOG"(
"ID" NUMBER(18,0) NOT NULL,
"JOB_NAME" VARCHAR2(100) NOT NULL,
"JOB_STATUS" VARCHAR2(10) NOT NULL,
"TARGET" VARCHAR2(100) NOT NULL,
"SCRIPT_TYPE" VARCHAR2(100) NOT NULL,
"SCRIPT_DATA" VARCHAR2(4000),
"SCRIPT_RESULT" VARCHAR2(4000),
"START_TIME" TIMESTAMP NOT NULL,
"END_TIME" TIMESTAMP,
CONSTRAINT "JOB_LOG_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "PATH_MAPPING"(
"ID" NUMBER(18,0) NOT NULL,
"REGEX" VARCHAR2(1000) NOT NULL,
"REPLACEMENT" VARCHAR2(1000) NOT NULL,
"PROCESS_TYPE" VARCHAR2(1) NOT NULL,
"SORT_ORDER" NUMBER(9,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "PATH_MAPPING_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "CRAWLING_SESSION"(
"ID" NUMBER(18,0) NOT NULL,
"SESSION_ID" VARCHAR2(20) NOT NULL,
"NAME" VARCHAR2(20),
"EXPIRED_TIME" TIMESTAMP,
"CREATED_TIME" TIMESTAMP NOT NULL,
CONSTRAINT "CRAWLING_SESSION_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "OVERLAPPING_HOST"(
"ID" NUMBER(18,0) NOT NULL,
"REGULAR_NAME" VARCHAR2(1000) NOT NULL,
"OVERLAPPING_NAME" VARCHAR2(1000) NOT NULL,
"SORT_ORDER" NUMBER(9,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "OVERLAPPING_HOST_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "REQUEST_HEADER"(
"ID" NUMBER(18,0) NOT NULL,
"NAME" VARCHAR2(100) NOT NULL,
"VALUE" VARCHAR2(1000) NOT NULL,
"WEB_CRAWLING_CONFIG_ID" NUMBER(18,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "REQUEST_HEADER_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (WEB_CRAWLING_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID)
);
CREATE TABLE "BOOST_DOCUMENT_RULE"(
"ID" NUMBER(18,0) NOT NULL,
"URL_EXPR" VARCHAR2(4000) NOT NULL,
"BOOST_EXPR" VARCHAR2(4000) NOT NULL,
"SORT_ORDER" NUMBER(9,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "BOOST_DOCUMENT_RULE_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "KEY_MATCH"(
"ID" NUMBER(18,0) NOT NULL,
"TERM" VARCHAR2(100) NOT NULL,
"QUERY" VARCHAR2(4000) NOT NULL,
"MAX_SIZE" NUMBER(9,0) NOT NULL,
"BOOST" FLOAT NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "KEY_MATCH_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "WEB_AUTHENTICATION"(
"ID" NUMBER(18,0) NOT NULL,
"HOSTNAME" VARCHAR2(100),
"PORT" NUMBER(9,0) NOT NULL,
"AUTH_REALM" VARCHAR2(100),
"PROTOCOL_SCHEME" VARCHAR2(10),
"USERNAME" VARCHAR2(100) NOT NULL,
"PASSWORD" VARCHAR2(100),
"PARAMETERS" VARCHAR2(1000),
"WEB_CRAWLING_CONFIG_ID" NUMBER(18,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "WEB_AUTHENTICATION_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (WEB_CRAWLING_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID)
);
CREATE TABLE "CRAWLING_SESSION_INFO"(
"ID" NUMBER(18,0) NOT NULL,
"CRAWLING_SESSION_ID" NUMBER(18,0) NOT NULL,
"KEY" VARCHAR2(20) NOT NULL,
"VALUE" VARCHAR2(100) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
CONSTRAINT "CRAWLING_SESSION_INFO_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (CRAWLING_SESSION_ID) REFERENCES CRAWLING_SESSION (ID)
);
CREATE TABLE "LABEL_TYPE"(
"ID" NUMBER(18,0) NOT NULL,
"NAME" VARCHAR2(100) NOT NULL,
"VALUE" VARCHAR2(20) NOT NULL,
"INCLUDED_PATHS" VARCHAR2(4000),
"EXCLUDED_PATHS" VARCHAR2(4000),
"SORT_ORDER" NUMBER(9,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "LABEL_TYPE_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "FILE_CONFIG_TO_LABEL_TYPE"(
"ID" NUMBER(18,0) NOT NULL,
"FILE_CONFIG_ID" NUMBER(18,0) NOT NULL,
"LABEL_TYPE_ID" NUMBER(18,0) NOT NULL,
CONSTRAINT "FILE_CONFIG_TO_LABEL_TYPE_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID),
FOREIGN KEY (FILE_CONFIG_ID) REFERENCES FILE_CRAWLING_CONFIG (ID)
);
CREATE TABLE "WEB_CONFIG_TO_LABEL_TYPE"(
"ID" NUMBER(18,0) NOT NULL,
"WEB_CONFIG_ID" NUMBER(18,0) NOT NULL,
"LABEL_TYPE_ID" NUMBER(18,0) NOT NULL,
CONSTRAINT "WEB_CONFIG_TO_LABEL_TYPE_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID),
FOREIGN KEY (WEB_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID)
);
CREATE TABLE "ROLE_TYPE"(
"ID" NUMBER(18,0) NOT NULL,
"NAME" VARCHAR2(100) NOT NULL,
"VALUE" VARCHAR2(20) NOT NULL,
"SORT_ORDER" NUMBER(9,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "ROLE_TYPE_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "FILE_CONFIG_TO_ROLE_TYPE"(
"ID" NUMBER(18,0) NOT NULL,
"FILE_CONFIG_ID" NUMBER(18,0) NOT NULL,
"ROLE_TYPE_ID" NUMBER(18,0) NOT NULL,
CONSTRAINT "FILE_CONFIG_TO_ROLE_TYPE_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (FILE_CONFIG_ID) REFERENCES FILE_CRAWLING_CONFIG (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
CREATE TABLE "WEB_CONFIG_TO_ROLE_TYPE"(
"ID" NUMBER(18,0) NOT NULL,
"WEB_CONFIG_ID" NUMBER(18,0) NOT NULL,
"ROLE_TYPE_ID" NUMBER(18,0) NOT NULL,
CONSTRAINT "WEB_CONFIG_TO_ROLE_TYPE_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (WEB_CONFIG_ID) REFERENCES WEB_CRAWLING_CONFIG (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
CREATE TABLE "DATA_CRAWLING_CONFIG"(
"ID" NUMBER(18,0) NOT NULL,
"NAME" VARCHAR2(200) NOT NULL,
"HANDLER_NAME" VARCHAR2(200) NOT NULL,
"HANDLER_PARAMETER" VARCHAR2(4000),
"HANDLER_SCRIPT" VARCHAR2(4000),
"BOOST" FLOAT NOT NULL,
"AVAILABLE" VARCHAR2(1) NOT NULL,
"SORT_ORDER" NUMBER(9,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "DATA_CRAWLING_CONFIG_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "DATA_CONFIG_TO_ROLE_TYPE"(
"ID" NUMBER(18,0) NOT NULL,
"DATA_CONFIG_ID" NUMBER(18,0) NOT NULL,
"ROLE_TYPE_ID" NUMBER(18,0) NOT NULL,
CONSTRAINT "DATA_CONFIG_TO_ROLE_TYPE_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (DATA_CONFIG_ID) REFERENCES DATA_CRAWLING_CONFIG (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
CREATE TABLE "DATA_CONFIG_TO_LABEL_TYPE"(
"ID" NUMBER(18,0) NOT NULL,
"DATA_CONFIG_ID" NUMBER(18,0) NOT NULL,
"LABEL_TYPE_ID" NUMBER(18,0) NOT NULL,
CONSTRAINT "DATA_CONFIG_TO_LABEL_TYPE_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (DATA_CONFIG_ID) REFERENCES DATA_CRAWLING_CONFIG (ID),
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID)
);
CREATE TABLE "USER_INFO" (
"ID" NUMBER(18,0) NOT NULL,
"CODE" VARCHAR2(1000) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_TIME" TIMESTAMP NOT NULL,
CONSTRAINT "USER_INFO_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "SEARCH_LOG"(
"ID" NUMBER(18,0) NOT NULL,
"SEARCH_WORD" VARCHAR2(1000),
"REQUESTED_TIME" TIMESTAMP NOT NULL,
"RESPONSE_TIME" NUMBER(9,0) NOT NULL,
"HIT_COUNT" NUMBER(18,0) NOT NULL,
"QUERY_OFFSET" NUMBER(9,0) NOT NULL,
"QUERY_PAGE_SIZE" NUMBER(9,0) NOT NULL,
"USER_AGENT" VARCHAR2(255),
"REFERER" VARCHAR2(1000),
"CLIENT_IP" VARCHAR2(50),
"USER_SESSION_ID" VARCHAR2(100),
"ACCESS_TYPE" VARCHAR2(1) NOT NULL,
"USER_ID" NUMBER(18,0),
CONSTRAINT "SEARCH_LOG_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (USER_ID) REFERENCES USER_INFO (ID)
);
CREATE TABLE "LABEL_TYPE_TO_ROLE_TYPE"(
"ID" NUMBER(18,0) NOT NULL,
"LABEL_TYPE_ID" NUMBER(18,0) NOT NULL,
"ROLE_TYPE_ID" NUMBER(18,0) NOT NULL,
CONSTRAINT "LABEL_TYPE_TO_ROLE_TYPE_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (LABEL_TYPE_ID) REFERENCES LABEL_TYPE (ID),
FOREIGN KEY (ROLE_TYPE_ID) REFERENCES ROLE_TYPE (ID)
);
CREATE TABLE "CLICK_LOG"(
"ID" NUMBER(18,0) NOT NULL,
"SEARCH_ID" NUMBER(18,0) NOT NULL,
"URL" VARCHAR2(4000) NOT NULL,
"REQUESTED_TIME" TIMESTAMP NOT NULL,
CONSTRAINT "CLICK_LOG_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (SEARCH_ID) REFERENCES SEARCH_LOG (ID)
);
CREATE TABLE "FAILURE_URL"(
"ID" NUMBER(18,0) NOT NULL,
"URL" VARCHAR2(4000) NOT NULL,
"THREAD_NAME" VARCHAR2(30) NOT NULL,
"ERROR_NAME" VARCHAR2(255),
"ERROR_LOG" VARCHAR2(4000),
"ERROR_COUNT" NUMBER(9,0) NOT NULL,
"LAST_ACCESS_TIME" TIMESTAMP NOT NULL,
"CONFIG_ID" VARCHAR2(100),
CONSTRAINT "FAILURE_URL_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE TABLE "FILE_AUTHENTICATION"(
"ID" NUMBER(18,0) NOT NULL,
"HOSTNAME" VARCHAR2(255),
"PORT" NUMBER(9,0) NOT NULL,
"PROTOCOL_SCHEME" VARCHAR2(10),
"USERNAME" VARCHAR2(100) NOT NULL,
"PASSWORD" VARCHAR2(100),
"PARAMETERS" VARCHAR2(1000),
"FILE_CRAWLING_CONFIG_ID" NUMBER(18,0) NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "FILE_AUTHENTICATION_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (FILE_CRAWLING_CONFIG_ID) REFERENCES FILE_CRAWLING_CONFIG (ID)
);
CREATE TABLE "SEARCH_FIELD_LOG" (
"ID" NUMBER(18,0) NOT NULL,
"SEARCH_ID" NUMBER(18,0) NOT NULL,
"NAME" VARCHAR2(255) NOT NULL,
"VALUE" VARCHAR2(4000) NOT NULL,
CONSTRAINT "SEARCH_FIELD_LOG_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (SEARCH_ID) REFERENCES SEARCH_LOG (ID)
);
CREATE TABLE "FAVORITE_LOG" (
"ID" NUMBER(18,0) NOT NULL,
"USER_ID" NUMBER(18,0) NOT NULL,
"URL" VARCHAR2(4000) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
CONSTRAINT "FAVORITE_LOG_PK" PRIMARY KEY ("ID") ENABLE,
FOREIGN KEY (USER_ID) REFERENCES USER_INFO (ID)
);
/**********************************/
/* Table Name: Suggest Bad Word */
/**********************************/
CREATE TABLE "SUGGEST_BAD_WORD" (
"ID" NUMBER(18,0) NOT NULL,
"SUGGEST_WORD" VARCHAR2(255) NOT NULL,
"TARGET_ROLE" VARCHAR2(255),
"TARGET_LABEL" VARCHAR2(255),
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "SUGGEST_BAD_WORD_PK" PRIMARY KEY ("ID") ENABLE
);
/**********************************/
/* Table Name: Suggest Elevate Word */
/**********************************/
CREATE TABLE "SUGGEST_ELEVATE_WORD" (
"ID" NUMBER(18,0) NOT NULL,
"SUGGEST_WORD" VARCHAR2(255) NOT NULL,
"READING" VARCHAR2(255),
"TARGET_ROLE" VARCHAR2(255),
"TARGET_LABEL" VARCHAR2(255),
"BOOST" FLOAT NOT NULL,
"CREATED_BY" VARCHAR2(255) NOT NULL,
"CREATED_TIME" TIMESTAMP NOT NULL,
"UPDATED_BY" VARCHAR2(255),
"UPDATED_TIME" TIMESTAMP,
"DELETED_BY" VARCHAR2(255),
"DELETED_TIME" TIMESTAMP,
"VERSION_NO" NUMBER(9,0) NOT NULL,
CONSTRAINT "SUGGEST_ELEVATE_WORD_PK" PRIMARY KEY ("ID") ENABLE
);
CREATE UNIQUE INDEX UQ_FAVORITE_LOG ON FAVORITE_LOG (USER_ID, URL);
CREATE INDEX IDX_O_H_BY_R_N_AND_S_O ON OVERLAPPING_HOST (REGULAR_NAME, SORT_ORDER);
CREATE INDEX IDX_F_C_TO_L_T_FOR_F_C ON FILE_CONFIG_TO_LABEL_TYPE (FILE_CONFIG_ID);
CREATE INDEX IDX_W_C_TO_L_T_FOR_W_C ON WEB_CONFIG_TO_LABEL_TYPE (WEB_CONFIG_ID);
CREATE INDEX IDX_F_C_TO_R_T_FOR_F_C ON FILE_CONFIG_TO_ROLE_TYPE (FILE_CONFIG_ID);
CREATE INDEX IDX_W_C_TO_R_T_FOR_W_C ON WEB_CONFIG_TO_ROLE_TYPE (WEB_CONFIG_ID);
CREATE INDEX IDX_D_C_TO_R_T_FOR_D_C ON DATA_CONFIG_TO_ROLE_TYPE (DATA_CONFIG_ID);
CREATE INDEX IDX_D_C_TO_L_T_FOR_D_C ON DATA_CONFIG_TO_LABEL_TYPE (DATA_CONFIG_ID);
CREATE INDEX IDX_S_L_BY_H_C ON SEARCH_LOG (HIT_COUNT);
CREATE INDEX IDX_S_L_BY_R_T ON SEARCH_LOG (RESPONSE_TIME);
CREATE INDEX IDX_S_L_BY_RT ON SEARCH_LOG (REQUESTED_TIME);
CREATE INDEX IDX_S_L_BY_S_W ON SEARCH_LOG (SEARCH_WORD);
CREATE INDEX IDX_S_L_BY_RT_USID ON SEARCH_LOG (REQUESTED_TIME, USER_SESSION_ID);
CREATE INDEX IDX_S_L_BY_USID ON SEARCH_LOG (USER_ID);
CREATE INDEX IDX_C_L_URL ON CLICK_LOG (URL);
CREATE INDEX IDX_F_U_FOR_L ON FAILURE_URL (URL, LAST_ACCESS_TIME, ERROR_NAME, ERROR_COUNT);
CREATE INDEX IDX_F_U_BY_W_C_ID ON FAILURE_URL (CONFIG_ID);
CREATE INDEX IDX_S_F_LOG_NAME ON SEARCH_FIELD_LOG (NAME);
CREATE INDEX IDX_S_N_EXPIRED ON CRAWLING_SESSION (NAME, EXPIRED_TIME);
CREATE SEQUENCE WEB_CRAWLING_CONFIG_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE FILE_CRAWLING_CONFIG_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE JOB_LOG_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE SCHEDULED_JOB_SEQ START WITH 5 INCREMENT BY 50;
CREATE SEQUENCE PATH_MAPPING_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE CRAWLING_SESSION_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE OVERLAPPING_HOST_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE REQUEST_HEADER_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE BOOST_DOCUMENT_RULE_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE KEY_MATCH_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE WEB_AUTHENTICATION_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE CRAWLING_SESSION_INFO_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE LABEL_TYPE_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE FILE_CONFIG_TO_LABEL_TYPE_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE WEB_CONFIG_TO_LABEL_TYPE_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE ROLE_TYPE_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE FILE_CONFIG_TO_ROLE_TYPE_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE WEB_CONFIG_TO_ROLE_TYPE_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE DATA_CRAWLING_CONFIG_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE DATA_CONFIG_TO_ROLE_TYPE_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE DATA_CONFIG_TO_LABEL_TYPE_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE USER_INFO_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE SEARCH_LOG_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE LABEL_TYPE_TO_ROLE_TYPE_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE CLICK_LOG_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE FAILURE_URL_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE FILE_AUTHENTICATION_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE SEARCH_FIELD_LOG_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE FAVORITE_LOG_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE SUGGEST_BAD_WORD_SEQ START WITH 1 INCREMENT BY 50;
CREATE SEQUENCE SUGGEST_ELEVATE_WORD_SEQ START WITH 1 INCREMENT BY 50;
INSERT INTO SCHEDULED_JOB (ID, NAME, TARGET, CRON_EXPRESSION, SCRIPT_TYPE, SCRIPT_DATA, CRAWLER, JOB_LOGGING, AVAILABLE, SORT_ORDER, CREATED_BY, CREATED_TIME, UPDATED_BY, UPDATED_TIME, VERSION_NO) VALUES (1, 'Crawler', 'all', '0 0 0 * * ?', 'groovy', 'return container.getComponent("crawlJob").execute(executor);', 'T', 'T', 'T', 0, 'system', to_date('2000-01-01', 'yyyy-MM-dd'), 'system', to_date('2000-01-01', 'yyyy-MM-dd'), 0);
INSERT INTO SCHEDULED_JOB (ID, NAME, TARGET, CRON_EXPRESSION, SCRIPT_TYPE, SCRIPT_DATA, CRAWLER, JOB_LOGGING, AVAILABLE, SORT_ORDER, CREATED_BY, CREATED_TIME, UPDATED_BY, UPDATED_TIME, VERSION_NO) VALUES (2, 'Minutely Tasks', 'all', '0 * * * * ?', 'groovy', 'return container.getComponent("aggregateLogJob").execute();', 'F', 'F', 'T', 10, 'system', to_date('2000-01-01', 'yyyy-MM-dd'), 'system', to_date('2000-01-01', 'yyyy-MM-dd'), 0);
INSERT INTO SCHEDULED_JOB (ID, NAME, TARGET, CRON_EXPRESSION, SCRIPT_TYPE, SCRIPT_DATA, CRAWLER, JOB_LOGGING, AVAILABLE, SORT_ORDER, CREATED_BY, CREATED_TIME, UPDATED_BY, UPDATED_TIME, VERSION_NO) VALUES (3, 'Hourly Tasks', 'all', '0 0 * * * ?', 'groovy', 'return container.getComponent("updateStatsJob").execute()+container.getComponent("updateHotWordJob").execute();', 'F', 'F', 'T', 20, 'system', to_date('2000-01-01', 'yyyy-MM-dd'), 'system', to_date('2000-01-01', 'yyyy-MM-dd'), 0);
INSERT INTO SCHEDULED_JOB (ID, NAME, TARGET, CRON_EXPRESSION, SCRIPT_TYPE, SCRIPT_DATA, CRAWLER, JOB_LOGGING, AVAILABLE, SORT_ORDER, CREATED_BY, CREATED_TIME, UPDATED_BY, UPDATED_TIME, VERSION_NO) VALUES (4, 'Daily Tasks', 'all', '0 0 0 * * ?', 'groovy', 'return container.getComponent("purgeLogJob").execute();', 'F', 'F', 'T', 30, 'system', to_date('2000-01-01', 'yyyy-MM-dd'), 'system', to_date('2000-01-01', 'yyyy-MM-dd'), 0);


@@ -1,15 +0,0 @@
#!/bin/bash
# the following is in fess-server/build.xml.
FILE=dbflute_oracle/schema/project-schema-fess.xml
perl -pi -e 's/name="DATA_CONFIG_TO_LABEL_TYPE"/javaName="DataConfigToLabelTypeMapping" name="DATA_CONFIG_TO_LABEL_TYPE"/' $FILE
perl -pi -e 's/name="DATA_CONFIG_TO_ROLE_TYPE"/javaName="DataConfigToRoleTypeMapping" name="DATA_CONFIG_TO_ROLE_TYPE"/' $FILE
perl -pi -e 's/name="FILE_CONFIG_TO_LABEL_TYPE"/javaName="FileConfigToLabelTypeMapping" name="FILE_CONFIG_TO_LABEL_TYPE"/' $FILE
perl -pi -e 's/name="FILE_CONFIG_TO_ROLE_TYPE"/javaName="FileConfigToRoleTypeMapping" name="FILE_CONFIG_TO_ROLE_TYPE"/' $FILE
perl -pi -e 's/name="LABEL_TYPE_TO_ROLE_TYPE"/javaName="LabelTypeToRoleTypeMapping" name="LABEL_TYPE_TO_ROLE_TYPE"/' $FILE
perl -pi -e 's/name="WEB_CONFIG_TO_LABEL_TYPE"/javaName="WebConfigToLabelTypeMapping" name="WEB_CONFIG_TO_LABEL_TYPE"/' $FILE
perl -pi -e 's/name="WEB_CONFIG_TO_ROLE_TYPE"/javaName="WebConfigToRoleTypeMapping" name="WEB_CONFIG_TO_ROLE_TYPE"/' $FILE
perl -pi -e 's/dbType="FLOAT" javaType="java.math.BigDecimal"/dbType="FLOAT" javaType="Float"/g' $FILE
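Each perl one-liner above injects a `javaName` attribute in front of the matching `name` attribute in the SchemaXML so DBFlute generates `...Mapping` class names. The same edit can be sketched in Python (the sample XML line below is illustrative; real table names here contain no regex metacharacters, so the pattern is not escaped):

```python
import re

# Mirror of: perl -pi -e 's/name="TBL"/javaName="JavaName" name="TBL"/' $FILE
def add_java_name(xml_text, table, java_name):
    # Insert a javaName attribute immediately before the matching name attribute.
    return re.sub('name="%s"' % table,
                  'javaName="%s" name="%s"' % (java_name, table),
                  xml_text)

line = '<table name="DATA_CONFIG_TO_LABEL_TYPE">'
print(add_java_name(line, "DATA_CONFIG_TO_LABEL_TYPE", "DataConfigToLabelTypeMapping"))
# -> <table javaName="DataConfigToLabelTypeMapping" name="DATA_CONFIG_TO_LABEL_TYPE">
```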


@@ -1,74 +0,0 @@
DROP TABLE IF EXISTS URL_FILTER;
DROP TABLE IF EXISTS ACCESS_RESULT_DATA;
DROP TABLE IF EXISTS ACCESS_RESULT;
DROP TABLE IF EXISTS URL_QUEUE;
/**********************************/
/* Table Name: URL Queue */
/**********************************/
CREATE TABLE URL_QUEUE(
ID IDENTITY NOT NULL PRIMARY KEY,
SESSION_ID VARCHAR(20) NOT NULL,
METHOD VARCHAR(10) NOT NULL,
URL VARCHAR(65536) NOT NULL,
META_DATA VARCHAR(65536),
ENCODING VARCHAR(20),
PARENT_URL VARCHAR(65536),
DEPTH INTEGER NOT NULL,
LAST_MODIFIED TIMESTAMP,
CREATE_TIME TIMESTAMP NOT NULL
);
/**********************************/
/* Table Name: Access Result */
/**********************************/
CREATE TABLE ACCESS_RESULT(
ID IDENTITY NOT NULL PRIMARY KEY,
SESSION_ID VARCHAR(20) NOT NULL,
RULE_ID VARCHAR(20),
URL VARCHAR(65536) NOT NULL,
PARENT_URL VARCHAR(65536),
STATUS INTEGER NOT NULL,
HTTP_STATUS_CODE INTEGER NOT NULL,
METHOD VARCHAR(10) NOT NULL,
MIME_TYPE VARCHAR(100) NOT NULL,
CONTENT_LENGTH BIGINT NOT NULL,
EXECUTION_TIME INTEGER NOT NULL,
LAST_MODIFIED TIMESTAMP,
CREATE_TIME TIMESTAMP NOT NULL
);
/**********************************/
/* Table Name: Access Result Data */
/**********************************/
CREATE TABLE ACCESS_RESULT_DATA(
ID BIGINT(20) NOT NULL PRIMARY KEY,
TRANSFORMER_NAME VARCHAR(255) NOT NULL,
DATA BLOB,
ENCODING VARCHAR(20),
FOREIGN KEY (ID) REFERENCES ACCESS_RESULT (ID)
);
/**********************************/
/* Table Name: URL Filter */
/**********************************/
CREATE TABLE URL_FILTER(
ID IDENTITY NOT NULL PRIMARY KEY,
SESSION_ID VARCHAR(20) NOT NULL,
URL VARCHAR(65536) NOT NULL,
FILTER_TYPE VARCHAR(1) NOT NULL,
CREATE_TIME TIMESTAMP NOT NULL
);
CREATE INDEX IDX_URL_QUEUE_SESSION_ID_AND_TIME ON URL_QUEUE (SESSION_ID, CREATE_TIME);
CREATE INDEX IDX_URL_QUEUE_SESSION_ID_AND_URL ON URL_QUEUE (SESSION_ID, URL);
CREATE INDEX IDX_URL_QUEUE_SESSION_ID ON URL_QUEUE (SESSION_ID);
CREATE INDEX IDX_ACCESS_RESULT_SESSION_ID_AND_TIME ON ACCESS_RESULT (SESSION_ID, CREATE_TIME);
CREATE INDEX IDX_ACCESS_RESULT_SESSION_ID_AND_URL ON ACCESS_RESULT (SESSION_ID, URL);
CREATE INDEX IDX_ACCESS_RESULT_SESSION_ID ON ACCESS_RESULT (SESSION_ID);
CREATE INDEX IDX_ACCESS_RESULT_URL_AND_TIME ON ACCESS_RESULT (URL, CREATE_TIME);
CREATE INDEX IDX_URL_FILTER_SESSION_ID_AND_FILTER_TYPE ON URL_FILTER (SESSION_ID, FILTER_TYPE);


@@ -40,9 +40,9 @@ public class Constants extends CoreLibConstants {
public static final String FALSE = "false";
public static final String T = "T";
public static final Boolean T = true;
public static final String F = "F";
public static final Boolean F = false;
public static final String ON = "on";
@@ -238,6 +238,14 @@ public class Constants extends CoreLibConstants {
public static final String SEARCH_LOG_ACCESS_TYPE = "searchLogAccessType";
public static final String SEARCH_LOG_ACCESS_TYPE_JSON = "json";
public static final String SEARCH_LOG_ACCESS_TYPE_XML = "xml";
public static final String SEARCH_LOG_ACCESS_TYPE_WEB = "web";
public static final String SEARCH_LOG_ACCESS_TYPE_OTHER = "other";
public static final String RESULTS_PER_PAGE = "resultsPerPage";
public static final String USER_CODE = "userCode";
@@ -309,4 +317,12 @@ public class Constants extends CoreLibConstants {
public static final int DEFAULT_START_COUNT = 0;
public static final String PROCESS_TYPE_CRAWLING = "C";
public static final String PROCESS_TYPE_DISPLAYING = "D";
public static final String PROCESS_TYPE_BOTH = "B";
public static final long ONE_DAY_IN_MILLIS = 24L * 60L * 60L * 1000L;
}


@@ -38,7 +38,6 @@ import org.codelibs.fess.api.WebApiManager;
import org.codelibs.fess.api.WebApiRequest;
import org.codelibs.fess.api.WebApiResponse;
import org.codelibs.fess.client.FessEsClient;
import org.codelibs.fess.db.allcommon.CDef;
import org.codelibs.fess.entity.PingResponse;
import org.codelibs.fess.util.ComponentUtil;
import org.codelibs.fess.util.FacetResponse;
@@ -124,7 +123,7 @@ public class JsonApiManager extends BaseApiManager implements WebApiManager {
String errMsg = StringUtil.EMPTY;
String query = null;
final StringBuilder buf = new StringBuilder(1000);
request.setAttribute(Constants.SEARCH_LOG_ACCESS_TYPE, CDef.AccessType.Json);
request.setAttribute(Constants.SEARCH_LOG_ACCESS_TYPE, Constants.SEARCH_LOG_ACCESS_TYPE_JSON);
final String queryId = request.getParameter("queryId");
try {
chain.doFilter(new WebApiRequest(request, SEARCH_API), new WebApiResponse(response));


@@ -36,7 +36,6 @@ import org.codelibs.fess.api.WebApiManager;
import org.codelibs.fess.api.WebApiRequest;
import org.codelibs.fess.api.WebApiResponse;
import org.codelibs.fess.client.FessEsClient;
import org.codelibs.fess.db.allcommon.CDef;
import org.codelibs.fess.entity.PingResponse;
import org.codelibs.fess.util.ComponentUtil;
import org.codelibs.fess.util.FacetResponse;
@@ -112,7 +111,7 @@ public class XmlApiManager extends BaseApiManager implements WebApiManager {
String errMsg = StringUtil.EMPTY;
final StringBuilder buf = new StringBuilder(1000);
String query = null;
request.setAttribute(Constants.SEARCH_LOG_ACCESS_TYPE, CDef.AccessType.Xml);
request.setAttribute(Constants.SEARCH_LOG_ACCESS_TYPE, Constants.SEARCH_LOG_ACCESS_TYPE_XML);
final String queryId = request.getParameter("queryId");
try {
chain.doFilter(new WebApiRequest(request, SEARCH_API), new WebApiResponse(response));


@@ -33,6 +33,7 @@ public class LocalDateTimeConverter implements Converter {
this.pattern = pattern;
}
@Override
public Object getAsObject(String value) {
if (StringUtil.isEmpty(value)) {
return null;
@@ -40,10 +41,12 @@ public class LocalDateTimeConverter implements Converter {
return LocalDateTime.parse(value, DateTimeFormatter.ofPattern(pattern));
}
@Override
public String getAsString(Object value) {
return ((LocalDateTime) value).format(DateTimeFormatter.ofPattern(pattern));
}
@Override
public boolean isTarget(Class clazz) {
return clazz == LocalDateTime.class;
}


@@ -25,7 +25,12 @@ import org.codelibs.fess.entity.SearchQuery.SortField;
import org.codelibs.fess.solr.FessSolrQueryException;
import org.codelibs.fess.util.ComponentUtil;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.Action;
import org.elasticsearch.action.ActionFuture;
import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.ActionRequest;
import org.elasticsearch.action.ActionRequestBuilder;
import org.elasticsearch.action.ActionResponse;
import org.elasticsearch.action.ShardOperationFailedException;
import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
import org.elasticsearch.action.admin.indices.create.CreateIndexResponse;
@@ -34,13 +39,78 @@ import org.elasticsearch.action.admin.indices.mapping.get.GetMappingsResponse;
import org.elasticsearch.action.admin.indices.mapping.put.PutMappingResponse;
import org.elasticsearch.action.admin.indices.optimize.OptimizeResponse;
import org.elasticsearch.action.admin.indices.refresh.RefreshResponse;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkRequestBuilder;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.count.CountRequest;
import org.elasticsearch.action.count.CountRequestBuilder;
import org.elasticsearch.action.count.CountResponse;
import org.elasticsearch.action.delete.DeleteRequest;
import org.elasticsearch.action.delete.DeleteRequestBuilder;
import org.elasticsearch.action.delete.DeleteResponse;
import org.elasticsearch.action.deletebyquery.DeleteByQueryRequest;
import org.elasticsearch.action.deletebyquery.DeleteByQueryRequestBuilder;
import org.elasticsearch.action.deletebyquery.DeleteByQueryResponse;
import org.elasticsearch.action.exists.ExistsRequest;
import org.elasticsearch.action.exists.ExistsRequestBuilder;
import org.elasticsearch.action.exists.ExistsResponse;
import org.elasticsearch.action.explain.ExplainRequest;
import org.elasticsearch.action.explain.ExplainRequestBuilder;
import org.elasticsearch.action.explain.ExplainResponse;
import org.elasticsearch.action.fieldstats.FieldStatsRequest;
import org.elasticsearch.action.fieldstats.FieldStatsRequestBuilder;
import org.elasticsearch.action.fieldstats.FieldStatsResponse;
import org.elasticsearch.action.get.GetRequest;
import org.elasticsearch.action.get.GetRequestBuilder;
import org.elasticsearch.action.get.GetResponse;
import org.elasticsearch.action.get.MultiGetRequest;
import org.elasticsearch.action.get.MultiGetRequestBuilder;
import org.elasticsearch.action.get.MultiGetResponse;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.action.index.IndexRequest.OpType;
import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.action.indexedscripts.delete.DeleteIndexedScriptRequest;
import org.elasticsearch.action.indexedscripts.delete.DeleteIndexedScriptRequestBuilder;
import org.elasticsearch.action.indexedscripts.delete.DeleteIndexedScriptResponse;
import org.elasticsearch.action.indexedscripts.get.GetIndexedScriptRequest;
import org.elasticsearch.action.indexedscripts.get.GetIndexedScriptRequestBuilder;
import org.elasticsearch.action.indexedscripts.get.GetIndexedScriptResponse;
import org.elasticsearch.action.indexedscripts.put.PutIndexedScriptRequest;
import org.elasticsearch.action.indexedscripts.put.PutIndexedScriptRequestBuilder;
import org.elasticsearch.action.indexedscripts.put.PutIndexedScriptResponse;
import org.elasticsearch.action.mlt.MoreLikeThisRequest;
import org.elasticsearch.action.mlt.MoreLikeThisRequestBuilder;
import org.elasticsearch.action.percolate.MultiPercolateRequest;
import org.elasticsearch.action.percolate.MultiPercolateRequestBuilder;
import org.elasticsearch.action.percolate.MultiPercolateResponse;
import org.elasticsearch.action.percolate.PercolateRequest;
import org.elasticsearch.action.percolate.PercolateRequestBuilder;
import org.elasticsearch.action.percolate.PercolateResponse;
import org.elasticsearch.action.search.ClearScrollRequest;
import org.elasticsearch.action.search.ClearScrollRequestBuilder;
import org.elasticsearch.action.search.ClearScrollResponse;
import org.elasticsearch.action.search.MultiSearchRequest;
import org.elasticsearch.action.search.MultiSearchRequestBuilder;
import org.elasticsearch.action.search.MultiSearchResponse;
import org.elasticsearch.action.search.SearchRequest;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.action.search.SearchScrollRequest;
import org.elasticsearch.action.search.SearchScrollRequestBuilder;
import org.elasticsearch.action.suggest.SuggestRequest;
import org.elasticsearch.action.suggest.SuggestRequestBuilder;
import org.elasticsearch.action.suggest.SuggestResponse;
import org.elasticsearch.action.termvector.MultiTermVectorsRequest;
import org.elasticsearch.action.termvector.MultiTermVectorsRequestBuilder;
import org.elasticsearch.action.termvector.MultiTermVectorsResponse;
import org.elasticsearch.action.termvector.TermVectorRequest;
import org.elasticsearch.action.termvector.TermVectorRequestBuilder;
import org.elasticsearch.action.termvector.TermVectorResponse;
import org.elasticsearch.action.update.UpdateRequest;
import org.elasticsearch.action.update.UpdateRequestBuilder;
import org.elasticsearch.action.update.UpdateResponse;
import org.elasticsearch.client.AdminClient;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.cluster.metadata.MappingMetaData;
@@ -64,6 +134,7 @@ import org.elasticsearch.search.aggregations.bucket.terms.TermsBuilder;
import org.elasticsearch.search.sort.FieldSortBuilder;
import org.elasticsearch.search.sort.SortBuilders;
import org.elasticsearch.search.sort.SortOrder;
import org.elasticsearch.threadpool.ThreadPool;
import org.seasar.framework.container.annotation.tiger.DestroyMethod;
import org.seasar.framework.container.annotation.tiger.InitMethod;
import org.slf4j.Logger;
@@ -71,7 +142,7 @@ import org.slf4j.LoggerFactory;
import com.google.common.io.BaseEncoding;
public class FessEsClient {
public class FessEsClient implements Client {
private static final Logger logger = LoggerFactory.getLogger(FessEsClient.class);
protected ElasticsearchClusterRunner runner;
@@ -246,6 +317,7 @@ public class FessEsClient {
}
}
@Override
@DestroyMethod
public void close() {
try {
@@ -640,4 +712,446 @@ public class FessEsClient {
T build(SearchResponse response, SearchHit hit);
}
//
// Elasticsearch Client
//
@Override
public <Request extends ActionRequest, Response extends ActionResponse, RequestBuilder extends ActionRequestBuilder<Request, Response, RequestBuilder, Client>> ActionFuture<Response> execute(
Action<Request, Response, RequestBuilder, Client> action, Request request) {
return client.execute(action, request);
}
@Override
public <Request extends ActionRequest, Response extends ActionResponse, RequestBuilder extends ActionRequestBuilder<Request, Response, RequestBuilder, Client>> void execute(
Action<Request, Response, RequestBuilder, Client> action, Request request, ActionListener<Response> listener) {
client.execute(action, request, listener);
}
@Override
public <Request extends ActionRequest, Response extends ActionResponse, RequestBuilder extends ActionRequestBuilder<Request, Response, RequestBuilder, Client>> RequestBuilder prepareExecute(
Action<Request, Response, RequestBuilder, Client> action) {
return client.prepareExecute(action);
}
@Override
public ThreadPool threadPool() {
return client.threadPool();
}
@Override
public AdminClient admin() {
return client.admin();
}
@Override
public ActionFuture<IndexResponse> index(IndexRequest request) {
return client.index(request);
}
@Override
public void index(IndexRequest request, ActionListener<IndexResponse> listener) {
client.index(request, listener);
}
@Override
public IndexRequestBuilder prepareIndex() {
return client.prepareIndex();
}
@Override
public ActionFuture<UpdateResponse> update(UpdateRequest request) {
return client.update(request);
}
@Override
public void update(UpdateRequest request, ActionListener<UpdateResponse> listener) {
client.update(request, listener);
}
@Override
public UpdateRequestBuilder prepareUpdate() {
return client.prepareUpdate();
}
@Override
public UpdateRequestBuilder prepareUpdate(String index, String type, String id) {
return client.prepareUpdate(index, type, id);
}
@Override
public IndexRequestBuilder prepareIndex(String index, String type) {
return client.prepareIndex(index, type);
}
@Override
public IndexRequestBuilder prepareIndex(String index, String type, String id) {
return client.prepareIndex(index, type, id);
}
@Override
public ActionFuture<DeleteResponse> delete(DeleteRequest request) {
return client.delete(request);
}
@Override
public void delete(DeleteRequest request, ActionListener<DeleteResponse> listener) {
client.delete(request, listener);
}
@Override
public DeleteRequestBuilder prepareDelete() {
return client.prepareDelete();
}
@Override
public DeleteRequestBuilder prepareDelete(String index, String type, String id) {
return client.prepareDelete(index, type, id);
}
@Override
public ActionFuture<BulkResponse> bulk(BulkRequest request) {
return client.bulk(request);
}
@Override
public void bulk(BulkRequest request, ActionListener<BulkResponse> listener) {
client.bulk(request, listener);
}
@Override
public BulkRequestBuilder prepareBulk() {
return client.prepareBulk();
}
@Override
public ActionFuture<DeleteByQueryResponse> deleteByQuery(DeleteByQueryRequest request) {
return client.deleteByQuery(request);
}
@Override
public void deleteByQuery(DeleteByQueryRequest request, ActionListener<DeleteByQueryResponse> listener) {
client.deleteByQuery(request, listener);
}
@Override
public DeleteByQueryRequestBuilder prepareDeleteByQuery(String... indices) {
return client.prepareDeleteByQuery(indices);
}
@Override
public ActionFuture<GetResponse> get(GetRequest request) {
return client.get(request);
}
@Override
public void get(GetRequest request, ActionListener<GetResponse> listener) {
client.get(request, listener);
}
@Override
public GetRequestBuilder prepareGet() {
return client.prepareGet();
}
@Override
public GetRequestBuilder prepareGet(String index, String type, String id) {
return client.prepareGet(index, type, id);
}
@Override
public PutIndexedScriptRequestBuilder preparePutIndexedScript() {
return client.preparePutIndexedScript();
}
@Override
public PutIndexedScriptRequestBuilder preparePutIndexedScript(String scriptLang, String id, String source) {
return client.preparePutIndexedScript(scriptLang, id, source);
}
@Override
public void deleteIndexedScript(DeleteIndexedScriptRequest request, ActionListener<DeleteIndexedScriptResponse> listener) {
client.deleteIndexedScript(request, listener);
}
@Override
public ActionFuture<DeleteIndexedScriptResponse> deleteIndexedScript(DeleteIndexedScriptRequest request) {
return client.deleteIndexedScript(request);
}
@Override
public DeleteIndexedScriptRequestBuilder prepareDeleteIndexedScript() {
return client.prepareDeleteIndexedScript();
}
@Override
public DeleteIndexedScriptRequestBuilder prepareDeleteIndexedScript(String scriptLang, String id) {
return client.prepareDeleteIndexedScript(scriptLang, id);
}
@Override
public void putIndexedScript(PutIndexedScriptRequest request, ActionListener<PutIndexedScriptResponse> listener) {
client.putIndexedScript(request, listener);
}
@Override
public ActionFuture<PutIndexedScriptResponse> putIndexedScript(PutIndexedScriptRequest request) {
return client.putIndexedScript(request);
}
@Override
public GetIndexedScriptRequestBuilder prepareGetIndexedScript() {
return client.prepareGetIndexedScript();
}
@Override
public GetIndexedScriptRequestBuilder prepareGetIndexedScript(String scriptLang, String id) {
return client.prepareGetIndexedScript(scriptLang, id);
}
@Override
public void getIndexedScript(GetIndexedScriptRequest request, ActionListener<GetIndexedScriptResponse> listener) {
client.getIndexedScript(request, listener);
}
@Override
public ActionFuture<GetIndexedScriptResponse> getIndexedScript(GetIndexedScriptRequest request) {
return client.getIndexedScript(request);
}
@Override
public ActionFuture<MultiGetResponse> multiGet(MultiGetRequest request) {
return client.multiGet(request);
}
@Override
public void multiGet(MultiGetRequest request, ActionListener<MultiGetResponse> listener) {
client.multiGet(request, listener);
}
@Override
public MultiGetRequestBuilder prepareMultiGet() {
return client.prepareMultiGet();
}
@Override
public ActionFuture<CountResponse> count(CountRequest request) {
return client.count(request);
}
@Override
public void count(CountRequest request, ActionListener<CountResponse> listener) {
client.count(request, listener);
}
@Override
public CountRequestBuilder prepareCount(String... indices) {
return client.prepareCount(indices);
}
@Override
public ActionFuture<ExistsResponse> exists(ExistsRequest request) {
return client.exists(request);
}
@Override
public void exists(ExistsRequest request, ActionListener<ExistsResponse> listener) {
client.exists(request, listener);
}
@Override
public ExistsRequestBuilder prepareExists(String... indices) {
return client.prepareExists(indices);
}
@Override
public ActionFuture<SuggestResponse> suggest(SuggestRequest request) {
return client.suggest(request);
}
@Override
public void suggest(SuggestRequest request, ActionListener<SuggestResponse> listener) {
client.suggest(request, listener);
}
@Override
public SuggestRequestBuilder prepareSuggest(String... indices) {
return client.prepareSuggest(indices);
}
@Override
public ActionFuture<SearchResponse> search(SearchRequest request) {
return client.search(request);
}
@Override
public void search(SearchRequest request, ActionListener<SearchResponse> listener) {
client.search(request, listener);
}
@Override
public SearchRequestBuilder prepareSearch(String... indices) {
return client.prepareSearch(indices);
}
@Override
public ActionFuture<SearchResponse> searchScroll(SearchScrollRequest request) {
return client.searchScroll(request);
}
@Override
public void searchScroll(SearchScrollRequest request, ActionListener<SearchResponse> listener) {
client.searchScroll(request, listener);
}
@Override
public SearchScrollRequestBuilder prepareSearchScroll(String scrollId) {
return client.prepareSearchScroll(scrollId);
}
@Override
public ActionFuture<MultiSearchResponse> multiSearch(MultiSearchRequest request) {
return client.multiSearch(request);
}
@Override
public void multiSearch(MultiSearchRequest request, ActionListener<MultiSearchResponse> listener) {
client.multiSearch(request, listener);
}
@Override
public MultiSearchRequestBuilder prepareMultiSearch() {
return client.prepareMultiSearch();
}
@Override
public ActionFuture<SearchResponse> moreLikeThis(MoreLikeThisRequest request) {
return client.moreLikeThis(request);
}
@Override
public void moreLikeThis(MoreLikeThisRequest request, ActionListener<SearchResponse> listener) {
client.moreLikeThis(request, listener);
}
@Override
public MoreLikeThisRequestBuilder prepareMoreLikeThis(String index, String type, String id) {
return client.prepareMoreLikeThis(index, type, id);
}
@Override
public ActionFuture<TermVectorResponse> termVector(TermVectorRequest request) {
return client.termVector(request);
}
@Override
public void termVector(TermVectorRequest request, ActionListener<TermVectorResponse> listener) {
client.termVector(request, listener);
}
@Override
public TermVectorRequestBuilder prepareTermVector() {
return client.prepareTermVector();
}
@Override
public TermVectorRequestBuilder prepareTermVector(String index, String type, String id) {
return client.prepareTermVector(index, type, id);
}
@Override
public ActionFuture<MultiTermVectorsResponse> multiTermVectors(MultiTermVectorsRequest request) {
return client.multiTermVectors(request);
}
@Override
public void multiTermVectors(MultiTermVectorsRequest request, ActionListener<MultiTermVectorsResponse> listener) {
client.multiTermVectors(request, listener);
}
@Override
public MultiTermVectorsRequestBuilder prepareMultiTermVectors() {
return client.prepareMultiTermVectors();
}
@Override
public ActionFuture<PercolateResponse> percolate(PercolateRequest request) {
return client.percolate(request);
}
@Override
public void percolate(PercolateRequest request, ActionListener<PercolateResponse> listener) {
client.percolate(request, listener);
}
@Override
public PercolateRequestBuilder preparePercolate() {
return client.preparePercolate();
}
@Override
public ActionFuture<MultiPercolateResponse> multiPercolate(MultiPercolateRequest request) {
return client.multiPercolate(request);
}
@Override
public void multiPercolate(MultiPercolateRequest request, ActionListener<MultiPercolateResponse> listener) {
client.multiPercolate(request, listener);
}
@Override
public MultiPercolateRequestBuilder prepareMultiPercolate() {
return client.prepareMultiPercolate();
}
@Override
public ExplainRequestBuilder prepareExplain(String index, String type, String id) {
return client.prepareExplain(index, type, id);
}
@Override
public ActionFuture<ExplainResponse> explain(ExplainRequest request) {
return client.explain(request);
}
@Override
public void explain(ExplainRequest request, ActionListener<ExplainResponse> listener) {
client.explain(request, listener);
}
@Override
public ClearScrollRequestBuilder prepareClearScroll() {
return client.prepareClearScroll();
}
@Override
public ActionFuture<ClearScrollResponse> clearScroll(ClearScrollRequest request) {
return client.clearScroll(request);
}
@Override
public void clearScroll(ClearScrollRequest request, ActionListener<ClearScrollResponse> listener) {
client.clearScroll(request, listener);
}
@Override
public FieldStatsRequestBuilder prepareFieldStats() {
return client.prepareFieldStats();
}
@Override
public ActionFuture<FieldStatsResponse> fieldStats(FieldStatsRequest request) {
return client.fieldStats(request);
}
@Override
public void fieldStats(FieldStatsRequest request, ActionListener<FieldStatsResponse> listener) {
client.fieldStats(request, listener);
}
@Override
public Settings settings() {
return client.settings();
}
}
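Every overridden method above forwards unchanged to the wrapped `client` field, which is how `FessEsClient` can implement the Elasticsearch `Client` interface without reimplementing any transport logic. The pattern, reduced to a self-contained sketch (the `Greeter` interface and all names below are illustrative, not Elasticsearch APIs):

```java
// Minimal delegation sketch: the wrapper implements the same interface
// as the wrapped instance and forwards each call unchanged.
interface Greeter {
    String greet(String name);
}

class SimpleGreeter implements Greeter {
    @Override
    public String greet(String name) {
        return "Hello, " + name;
    }
}

class DelegatingGreeter implements Greeter {
    private final Greeter delegate;

    DelegatingGreeter(Greeter delegate) {
        this.delegate = delegate;
    }

    @Override
    public String greet(String name) {
        // Cross-cutting behavior (logging, lifecycle hooks) could go here;
        // the call itself is simply forwarded to the delegate.
        return delegate.greet(name);
    }
}

public class DelegationDemo {
    public static void main(String[] args) {
        Greeter greeter = new DelegatingGreeter(new SimpleGreeter());
        System.out.println(greeter.greet("Fess")); // prints Hello, Fess
    }
}
```

Because the wrapper is itself a `Client`, it can be handed to any code that expects one, while still exposing Fess-specific helper methods alongside the delegated interface.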


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsBoostDocumentRuleBhv;
/**
* The behavior of BOOST_DOCUMENT_RULE.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class BoostDocumentRuleBhv extends BsBoostDocumentRuleBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsClickLogBhv;
/**
* The behavior of CLICK_LOG.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class ClickLogBhv extends BsClickLogBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsCrawlingSessionBhv;
/**
* The behavior of CRAWLING_SESSION.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class CrawlingSessionBhv extends BsCrawlingSessionBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsCrawlingSessionInfoBhv;
/**
* The behavior of CRAWLING_SESSION_INFO.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class CrawlingSessionInfoBhv extends BsCrawlingSessionInfoBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsDataConfigToLabelTypeMappingBhv;
/**
* The behavior of DATA_CONFIG_TO_LABEL_TYPE_MAPPING.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class DataConfigToLabelTypeMappingBhv extends BsDataConfigToLabelTypeMappingBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsDataConfigToRoleTypeMappingBhv;
/**
* The behavior of DATA_CONFIG_TO_ROLE_TYPE_MAPPING.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class DataConfigToRoleTypeMappingBhv extends BsDataConfigToRoleTypeMappingBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsDataCrawlingConfigBhv;
/**
* The behavior of DATA_CRAWLING_CONFIG.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class DataCrawlingConfigBhv extends BsDataCrawlingConfigBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsFailureUrlBhv;
/**
* The behavior of FAILURE_URL.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class FailureUrlBhv extends BsFailureUrlBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsFavoriteLogBhv;
/**
* The behavior of FAVORITE_LOG.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class FavoriteLogBhv extends BsFavoriteLogBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsFileAuthenticationBhv;
/**
* The behavior of FILE_AUTHENTICATION.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class FileAuthenticationBhv extends BsFileAuthenticationBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsFileConfigToLabelTypeMappingBhv;
/**
* The behavior of FILE_CONFIG_TO_LABEL_TYPE_MAPPING.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class FileConfigToLabelTypeMappingBhv extends BsFileConfigToLabelTypeMappingBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsFileConfigToRoleTypeMappingBhv;
/**
* The behavior of FILE_CONFIG_TO_ROLE_TYPE_MAPPING.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class FileConfigToRoleTypeMappingBhv extends BsFileConfigToRoleTypeMappingBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsFileCrawlingConfigBhv;
/**
* The behavior of FILE_CRAWLING_CONFIG.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class FileCrawlingConfigBhv extends BsFileCrawlingConfigBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsJobLogBhv;
/**
* The behavior of JOB_LOG.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class JobLogBhv extends BsJobLogBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsKeyMatchBhv;
/**
* The behavior of KEY_MATCH.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class KeyMatchBhv extends BsKeyMatchBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsLabelTypeBhv;
/**
* The behavior of LABEL_TYPE.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class LabelTypeBhv extends BsLabelTypeBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsLabelTypeToRoleTypeMappingBhv;
/**
* The behavior of LABEL_TYPE_TO_ROLE_TYPE_MAPPING.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class LabelTypeToRoleTypeMappingBhv extends BsLabelTypeToRoleTypeMappingBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsOverlappingHostBhv;
/**
* The behavior of OVERLAPPING_HOST.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class OverlappingHostBhv extends BsOverlappingHostBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsPathMappingBhv;
/**
* The behavior of PATH_MAPPING.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class PathMappingBhv extends BsPathMappingBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsRequestHeaderBhv;
/**
* The behavior of REQUEST_HEADER.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class RequestHeaderBhv extends BsRequestHeaderBhv {
}


@@ -1,30 +0,0 @@
/*
* Copyright 2009-2015 the CodeLibs Project and the Others.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language
* governing permissions and limitations under the License.
*/
package org.codelibs.fess.db.exbhv;
import org.codelibs.fess.db.bsbhv.BsRoleTypeBhv;
/**
* The behavior of ROLE_TYPE.
* <p>
* You can implement your original methods here.
* This class remains when re-generating.
* </p>
* @author DBFlute(AutoGenerator)
*/
public class RoleTypeBhv extends BsRoleTypeBhv {
}

Some files were not shown because too many files have changed in this diff.