Metabase CentOS Install Error - PLEASE HELP (Turkish locale)

[root@localhost ~]# java -jar metabase.jar
02-13 14:02:34 INFO metabase.util :: Loading Metabase…
02-13 14:02:41 INFO util.encryption :: DB details encryption is DISABLED for this Metabase instance. :unlock:
See http://www.metabase.com/docs/latest/operations-guide/start.html#encrypting-your-database-connection-details-at-rest for more information.
02-13 14:02:58 INFO metabase.core :: Starting Metabase in STANDALONE mode
02-13 14:02:58 INFO metabase.core :: Launching Embedded Jetty Webserver with config:
{:port 3000}

02-13 14:02:58 INFO metabase.core :: Starting Metabase version v0.28.1 (fe0c411 release-0.28.1) …
02-13 14:02:58 INFO metabase.core :: System timezone is ‘Europe/Istanbul’ …
02-13 14:03:00 INFO metabase.core :: Setting up and migrating Metabase DB. Please sit tight, this may take a minute…
02-13 14:03:00 INFO metabase.db :: Verifying h2 Database Connection …
02-13 14:03:01 INFO metabase.db :: Verify Database Connection … :white_check_mark:
02-13 14:03:01 INFO metabase.db :: Running Database Migrations…
02-13 14:03:01 INFO metabase.db :: Setting up Liquibase…
02-13 14:03:01 INFO metabase.db :: Liquibase is ready.
02-13 14:03:01 INFO metabase.db :: Checking if Database has unrun migrations…
liquibase.exception.DatabaseException: Unknown data type: “İNT”; SQL statement:
CREATE TABLE PUBLIC.DATABASECHANGELOG (ID VARCHAR(255) NOT NULL, AUTHOR VARCHAR(255) NOT NULL, FILENAME VARCHAR(255) NOT NULL, DATEEXECUTED TIMESTAMP NOT NULL, ORDEREXECUTED İNT NOT NULL, EXECTYPE VARCHAR(10) NOT NULL, MD5SUM VARCHAR(35), DESCRIPTION VARCHAR(255), COMMENTS VARCHAR(255), TAG VARCHAR(255), LIQUIBASE VARCHAR(20), CONTEXTS VARCHAR(255), LABELS VARCHAR(255), DEPLOYMENT_ID VARCHAR(10)) [50004-194] [Failed SQL: CREATE TABLE PUBLIC.DATABASECHANGELOG (ID VARCHAR(255) NOT NULL, AUTHOR VARCHAR(255) NOT NULL, FILENAME VARCHAR(255) NOT NULL, DATEEXECUTED TIMESTAMP NOT NULL, ORDEREXECUTED İNT NOT NULL, EXECTYPE VARCHAR(10) NOT NULL, MD5SUM VARCHAR(35), DESCRIPTION VARCHAR(255), COMMENTS VARCHAR(255), TAG VARCHAR(255), LIQUIBASE VARCHAR(20), CONTEXTS VARCHAR(255), LABELS VARCHAR(255), DEPLOYMENT_ID VARCHAR(10))]
at liquibase.executor.jvm.JdbcExecutor$ExecuteStatementCallback.doInStatement(JdbcExecutor.java:309)
at liquibase.executor.jvm.JdbcExecutor.execute(JdbcExecutor.java:55)
at liquibase.executor.jvm.JdbcExecutor.execute(JdbcExecutor.java:113)
at liquibase.executor.jvm.JdbcExecutor.execute(JdbcExecutor.java:103)
at liquibase.changelog.StandardChangeLogHistoryService.init(StandardChangeLogHistoryService.java:240)
at liquibase.Liquibase.checkLiquibaseTables(Liquibase.java:1124)
at liquibase.Liquibase.listUnrunChangeSets(Liquibase.java:1186)
at liquibase.Liquibase.listUnrunChangeSets(Liquibase.java:1176)
at liquibase.Liquibase.listUnrunChangeSets(Liquibase.java:1172)
at metabase.db$has_unrun_migrations_QMARK_.invokeStatic(db.clj:138)
at metabase.db$has_unrun_migrations_QMARK_.invoke(db.clj:131)
at metabase.db$migrate_up_if_needed_BANG_.invokeStatic(db.clj:164)
at metabase.db$migrate_up_if_needed_BANG_.invoke(db.clj:155)
at metabase.db$migrate_BANG_$fn__20229.invoke(db.clj:258)
at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:741)
at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:711)
at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:776)
at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:711)
at clojure.java.jdbc$db_transaction_STAR_.invokeStatic(jdbc.clj:724)
at clojure.java.jdbc$db_transaction_STAR_.invoke(jdbc.clj:711)
at metabase.db$migrate_BANG_.invokeStatic(db.clj:246)
at metabase.db$migrate_BANG_.invoke(db.clj:227)
at metabase.db$run_schema_migrations_BANG_.invokeStatic(db.clj:381)
at metabase.db$run_schema_migrations_BANG_.invoke(db.clj:376)
at metabase.db$setup_db_BANG_.invokeStatic(db.clj:399)
at metabase.db$setup_db_BANG_.doInvoke(db.clj:392)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at metabase.core$init_BANG_.invokeStatic(core.clj:140)
at metabase.core$init_BANG_.invoke(core.clj:119)
at metabase.core$start_normally.invokeStatic(core.clj:248)
at metabase.core$start_normally.invoke(core.clj:241)
at metabase.core$_main.invokeStatic(core.clj:269)
at metabase.core$_main.doInvoke(core.clj:264)
at clojure.lang.RestFn.invoke(RestFn.java:397)
at clojure.lang.AFn.applyToHelper(AFn.java:152)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at metabase.core.main(Unknown Source)
Caused by: org.h2.jdbc.JdbcSQLException: Unknown data type: “İNT”; SQL statement:
CREATE TABLE PUBLIC.DATABASECHANGELOG (ID VARCHAR(255) NOT NULL, AUTHOR VARCHAR(255) NOT NULL, FILENAME VARCHAR(255) NOT NULL, DATEEXECUTED TIMESTAMP NOT NULL, ORDEREXECUTED İNT NOT NULL, EXECTYPE VARCHAR(10) NOT NULL, MD5SUM VARCHAR(35), DESCRIPTION VARCHAR(255), COMMENTS VARCHAR(255), TAG VARCHAR(255), LIQUIBASE VARCHAR(20), CONTEXTS VARCHAR(255), LABELS VARCHAR(255), DEPLOYMENT_ID VARCHAR(10)) [50004-194]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:345)
at org.h2.message.DbException.get(DbException.java:179)
at org.h2.message.DbException.get(DbException.java:155)
at org.h2.command.Parser.parseColumnWithType(Parser.java:4144)
at org.h2.command.Parser.parseColumnForTable(Parser.java:3994)
at org.h2.command.Parser.parseCreateTable(Parser.java:6053)
at org.h2.command.Parser.parseCreate(Parser.java:4302)
at org.h2.command.Parser.parsePrepared(Parser.java:364)
at org.h2.command.Parser.parse(Parser.java:319)
at org.h2.command.Parser.parse(Parser.java:291)
at org.h2.command.Parser.prepareCommand(Parser.java:256)
at org.h2.engine.Session.prepareLocal(Session.java:564)
at org.h2.engine.Session.prepareCommand(Session.java:505)
at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1204)
at org.h2.jdbc.JdbcStatement.executeInternal(JdbcStatement.java:170)
at org.h2.jdbc.JdbcStatement.execute(JdbcStatement.java:158)
at liquibase.executor.jvm.JdbcExecutor$ExecuteStatementCallback.doInStatement(JdbcExecutor.java:307)
… 36 more
02-13 14:03:03 ERROR metabase.core :: Metabase Initialization FAILED: Unknown data type: “İNT”; SQL statement:
CREATE TABLE PUBLIC.DATABASECHANGELOG (ID VARCHAR(255) NOT NULL, AUTHOR VARCHAR(255) NOT NULL, FILENAME VARCHAR(255) NOT NULL, DATEEXECUTED TIMESTAMP NOT NULL, ORDEREXECUTED İNT NOT NULL, EXECTYPE VARCHAR(10) NOT NULL, MD5SUM VARCHAR(35), DESCRIPTION VARCHAR(255), COMMENTS VARCHAR(255), TAG VARCHAR(255), LIQUIBASE VARCHAR(20), CONTEXTS VARCHAR(255), LABELS VARCHAR(255), DEPLOYMENT_ID VARCHAR(10)) [50004-194] [Failed SQL: CREATE TABLE PUBLIC.DATABASECHANGELOG (ID VARCHAR(255) NOT NULL, AUTHOR VARCHAR(255) NOT NULL, FILENAME VARCHAR(255) NOT NULL, DATEEXECUTED TIMESTAMP NOT NULL, ORDEREXECUTED İNT NOT NULL, EXECTYPE VARCHAR(10) NOT NULL, MD5SUM VARCHAR(35), DESCRIPTION VARCHAR(255), COMMENTS VARCHAR(255), TAG VARCHAR(255), LIQUIBASE VARCHAR(20), CONTEXTS VARCHAR(255), LABELS VARCHAR(255), DEPLOYMENT_ID VARCHAR(10))]
02-13 14:03:03 INFO metabase.core :: Metabase Shutting Down …
02-13 14:03:03 INFO metabase.core :: Metabase Shutdown COMPLETE

What can I do? Can anyone help me, please?

Can anyone help?

Who can help, please…

You have to help others to help you.
Was this an upgrade or a fresh install?


I agree with what Andrew said. That you're sharing a full log is at least helpful - keep doing that when you want others to help. From it I could at least see:

  • exactly which version of Metabase you were running - and the full error you got

But please also show that you've made an effort by telling us:

  • more precisely which version of CentOS (and Java) you are running
  • as Andrew asked: were you running a fresh install or an upgrade?
  • what happened before the log: was this the first run, or did it work earlier?
  • did you try to redo a fresh install - or did you try to install on another machine?

If you do that, I'm sure people will be more willing to help next time :slight_smile:

I'm unable to reproduce this with a fresh install of CentOS in Docker on my Windows machine (using this image as of today):

C:\Users\jornh>docker run -ti nimmis/java-centos:openjdk-8-jdk
Unable to find image 'nimmis/java-centos:openjdk-8-jdk' locally
openjdk-8-jdk: Pulling from nimmis/java-centos
af4b0a2388c6: Already exists
6baf3c17164f: Pull complete
Digest: sha256:6059c58fe11b450c44dffb72b7c49694563a93234206d51f7a006566b2f60959
Status: Downloaded newer image for nimmis/java-centos:openjdk-8-jdk
[root@ae150a58271e ~]# ls
anaconda-ks.cfg
[root@ae150a58271e ~]# java -version
openjdk version "1.8.0_151"
OpenJDK Runtime Environment (build 1.8.0_151-b12)
OpenJDK 64-Bit Server VM (build 25.151-b12, mixed mode)
[root@ae150a58271e ~]# wget http://downloads.metabase.com/v0.28.1/metabase.jar
--2018-02-14 19:08:01--  http://downloads.metabase.com/v0.28.1/metabase.jar
Resolving downloads.metabase.com (downloads.metabase.com)... 52.216.226.242
Connecting to downloads.metabase.com (downloads.metabase.com)|52.216.226.242|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 112773736 (108M) [application/java-archive]
Saving to: 'metabase.jar'

100%[==============================================================================>] 112,773,736 3.25MB/s   in 3m 36s

2018-02-14 19:11:40 (509 KB/s) - 'metabase.jar' saved [112773736/112773736]

[root@ae150a58271e ~]# ls
anaconda-ks.cfg  metabase.jar
[root@ae150a58271e ~]# java -jar metabase.jar
02-14 19:12:48 INFO metabase.util :: Loading Metabase...
02-14 19:12:53 INFO util.encryption :: DB details encryption is DISABLED for this Metabase instance. ?
See http://www.metabase.com/docs/latest/operations-guide/start.html#encrypting-your-database-connection-details-at-rest for more information.
02-14 19:12:59 INFO metabase.core :: Starting Metabase in STANDALONE mode
02-14 19:12:59 INFO metabase.core :: Launching Embedded Jetty Webserver with config:
 {:port 3000}

02-14 19:12:59 INFO metabase.core :: Starting Metabase version v0.28.1 (fe0c411 release-0.28.1) ...
02-14 19:12:59 INFO metabase.core :: System timezone is 'UTC' ...
02-14 19:13:01 INFO metabase.core :: Setting up and migrating Metabase DB. Please sit tight, this may take a minute...
02-14 19:13:01 INFO metabase.db :: Verifying h2 Database Connection ...
02-14 19:13:02 INFO metabase.db :: Verify Database Connection ...  ?
02-14 19:13:02 INFO metabase.db :: Running Database Migrations...
02-14 19:13:02 INFO metabase.db :: Setting up Liquibase...
02-14 19:13:02 INFO metabase.db :: Liquibase is ready.
02-14 19:13:02 INFO metabase.db :: Checking if Database has unrun migrations...
02-14 19:13:04 INFO metabase.db :: Database has unrun migrations. Waiting for migration lock to be cleared...
02-14 19:13:04 INFO metabase.db :: Migration lock is cleared. Running migrations...
02-14 19:13:59 INFO metabase.db :: Database Migrations Current ...  ?
com.mchange.v2.cfg.DelayedLogItem [ level -> FINE, text -> "The configuration file for resource identifier 'hocon:/reference,/application,/c3p0,/' could not be found. Skipping.", exception -> null]
02-14 19:13:59 INFO db.migrations :: Running all necessary data migrations, this may take a minute.
02-14 19:13:59 INFO db.migrations :: Running data migration 'set-card-database-and-table-ids'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'set-mongodb-databases-ssl-false'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'set-default-schemas'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'set-admin-email'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'remove-database-sync-activity-entries'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'update-dashboards-to-new-grid'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'migrate-field-visibility-type'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'add-users-to-default-permissions-groups'...
02-14 19:13:59 INFO models.permissions-group :: Created magic permissions group 'All Users' (ID = 1)
02-14 19:13:59 INFO models.permissions-group :: Created magic permissions group 'Administrators' (ID = 2)
02-14 19:13:59 INFO db.migrations :: Running data migration 'add-admin-group-root-entry'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'add-databases-to-magic-permissions-groups'...
02-14 19:13:59 INFO models.permissions-group :: Created magic permissions group 'MetaBot' (ID = 3)
02-14 19:13:59 INFO db.migrations :: Running data migration 'migrate-field-types'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'fix-invalid-field-types'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'copy-site-url-setting-and-remove-trailing-slashes'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'migrate-query-executions'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'drop-old-query-execution-table'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'ensure-protocol-specified-in-site-url'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'populate-card-database-id'...
02-14 19:13:59 INFO db.migrations :: Running data migration 'migrate-humanization-setting'...
02-14 19:13:59 INFO db.migrations :: Finished running data migrations.
02-14 19:13:59 INFO metabase.events :: Starting events listener: metabase.events.activity-feed ?
02-14 19:13:59 INFO metabase.events :: Starting events listener: metabase.events.dependencies ?
02-14 19:13:59 INFO metabase.events :: Starting events listener: metabase.events.driver-notifications ?
02-14 19:13:59 INFO metabase.events :: Starting events listener: metabase.events.last-login ?
02-14 19:13:59 INFO metabase.events :: Starting events listener: metabase.events.metabot-lifecycle ?
02-14 19:13:59 INFO metabase.events :: Starting events listener: metabase.events.notifications ?
02-14 19:13:59 INFO metabase.events :: Starting events listener: metabase.events.revision ?
02-14 19:13:59 INFO metabase.events :: Starting events listener: metabase.events.sync-database ?
02-14 19:13:59 INFO metabase.events :: Starting events listener: metabase.events.view-log ?
02-14 19:13:59 INFO metabase.task :: Loading tasks namespace: metabase.task.cleanup-temporary-computation-job-results ?
02-14 19:13:59 INFO metabase.task :: Loading tasks namespace: metabase.task.follow-up-emails ?
02-14 19:14:00 INFO metabase.task :: Loading tasks namespace: metabase.task.send-anonymous-stats ?
02-14 19:14:00 INFO metabase.task :: Loading tasks namespace: metabase.task.send-pulses ?
02-14 19:14:00 INFO metabase.task :: Loading tasks namespace: metabase.task.sync-databases ?
02-14 19:14:00 INFO metabase.task :: Loading tasks namespace: metabase.task.upgrade-checks ?
02-14 19:14:00 INFO metabase.core :: Looks like this is a new installation ... preparing setup wizard
02-14 19:14:00 INFO metabase.core :: Please use the following url to setup your Metabase installation:

http://localhost:3000/setup/


02-14 19:14:00 INFO metabase.sample-data :: Loading sample dataset...
02-14 19:14:00 DEBUG sync.util :: Sync operations in flight: {:sync #{1}}
02-14 19:14:00 INFO sync.util :: STARTING: Sync h2 Database 1 'Sample Dataset'
02-14 19:14:00 DEBUG sync.util :: Sync operations in flight: {:sync #{1}, :sync-metadata #{1}}
02-14 19:14:00 INFO sync.util :: STARTING: Sync metadata for h2 Database 1 'Sample Dataset'
02-14 19:14:00 INFO sync-metadata.tables :: Found new tables: (Table  'PUBLIC.PRODUCTS' Table  'PUBLIC.ORDERS' Table  'PUBLIC.PEOPLE' Table  'PUBLIC.REVIEWS')
02-14 19:14:00 INFO sync-metadata.fks :: Marking foreign key from Table 2 'PUBLIC.ORDERS' Field 9 'USER_ID' -> Table 3 'PUBLIC.PEOPLE' Field 21 'ID'
02-14 19:14:00 INFO sync-metadata.fks :: Marking foreign key from Table 2 'PUBLIC.ORDERS' Field 11 'PRODUCT_ID' -> Table 1 'PUBLIC.PRODUCTS' Field 4 'ID'
02-14 19:14:00 INFO sync-metadata.fks :: Marking foreign key from Table 4 'PUBLIC.REVIEWS' Field 31 'PRODUCT_ID' -> Table 1 'PUBLIC.PRODUCTS' Field 4 'ID'
02-14 19:14:00 INFO sync.util :: FINISHED: Sync metadata for h2 Database 1 'Sample Dataset' (546 ms)
02-14 19:14:00 DEBUG sync.util :: Sync operations in flight: {:sync #{1}, :analyze #{1}}
02-14 19:14:00 INFO sync.util :: STARTING: Analyze data for h2 Database 1 'Sample Dataset'
02-14 19:14:00 INFO middleware.cache :: Using query processor cache backend: :db ?
02-14 19:14:01 DEBUG analyze.table-row-count :: Set table row count for Table 1 'PUBLIC.PRODUCTS' to 200
02-14 19:14:01 DEBUG analyze.fingerprint :: Saving fingerprint for Field 1 'EAN'
02-14 19:14:01 DEBUG analyze.fingerprint :: Saving fingerprint for Field 2 'RATING'
02-14 19:14:01 DEBUG analyze.fingerprint :: Saving fingerprint for Field 3 'PRICE'
02-14 19:14:01 DEBUG analyze.fingerprint :: Saving fingerprint for Field 4 'ID'
02-14 19:14:01 DEBUG analyze.fingerprint :: Saving fingerprint for Field 5 'TITLE'
02-14 19:14:01 DEBUG analyze.fingerprint :: Saving fingerprint for Field 6 'CATEGORY'
02-14 19:14:01 DEBUG analyze.fingerprint :: Saving fingerprint for Field 7 'CREATED_AT'
02-14 19:14:01 DEBUG analyze.fingerprint :: Saving fingerprint for Field 8 'VENDOR'
02-14 19:14:01 DEBUG classifiers.category :: Field 1 'EAN' has 200 distinct values. Since that is less than 300, we're marking it as a category.
02-14 19:14:01 DEBUG analyze.classify :: Based on classification, updating these values of Field 1 'EAN': {:special_type :type/Category}
02-14 19:14:01 DEBUG classifiers.category :: Field 2 'RATING' has 23 distinct values. Since that is less than 300, we're marking it as a category.
02-14 19:14:01 DEBUG analyze.classify :: Based on classification, updating these values of Field 2 'RATING': {:special_type :type/Category}
02-14 19:14:01 DEBUG classifiers.category :: Field 3 'PRICE' has 200 distinct values. Since that is less than 300, we're marking it as a category.
02-14 19:14:01 DEBUG analyze.classify :: Based on classification, updating these values of Field 3 'PRICE': {:special_type :type/Category}
02-14 19:14:01 DEBUG classifiers.name :: Based on the name of Field 4 'ID', we're giving it a special type of :type/PK.
02-14 19:14:01 DEBUG classifiers.category :: Field 5 'TITLE' has 198 distinct values. Since that is less than 300, we're marking it as a category.
02-14 19:14:01 DEBUG analyze.classify :: Based on classification, updating these values of Field 5 'TITLE': {:special_type :type/Category}
02-14 19:14:01 DEBUG classifiers.category :: Field 6 'CATEGORY' has 4 distinct values. Since that is less than 300, we're marking it as a category.
02-14 19:14:01 DEBUG analyze.classify :: Based on classification, updating these values of Field 6 'CATEGORY': {:special_type :type/Category}
02-14 19:14:01 DEBUG classifiers.category :: Field 8 'VENDOR' has 200 distinct values. Since that is less than 300, we're marking it as a category.
02-14 19:14:01 DEBUG analyze.classify :: Based on classification, updating these values of Field 8 'VENDOR': {:special_type :type/Category}
02-14 19:14:01 INFO sync.analyze :: [************??????????????????????????????????????] ?   25% Analyzed Table 1 'PUBLIC.PRODUCTS'
02-14 19:14:01 DEBUG analyze.table-row-count :: Set table row count for Table 2 'PUBLIC.ORDERS' to 12805
02-14 19:14:02 DEBUG analyze.fingerprint :: Saving fingerprint for Field 9 'USER_ID'
02-14 19:14:02 DEBUG analyze.fingerprint :: Saving fingerprint for Field 10 'DISCOUNT'
02-14 19:14:02 DEBUG analyze.fingerprint :: Saving fingerprint for Field 11 'PRODUCT_ID'
02-14 19:14:02 DEBUG analyze.fingerprint :: Saving fingerprint for Field 12 'ID'
02-14 19:14:02 DEBUG analyze.fingerprint :: Saving fingerprint for Field 13 'SUBTOTAL'
02-14 19:14:02 DEBUG analyze.fingerprint :: Saving fingerprint for Field 14 'QUANTITY'
02-14 19:14:02 DEBUG analyze.fingerprint :: Saving fingerprint for Field 15 'CREATED_AT'
02-14 19:14:02 DEBUG analyze.fingerprint :: Saving fingerprint for Field 16 'TAX'
02-14 19:14:02 DEBUG analyze.fingerprint :: Saving fingerprint for Field 17 'TOTAL'
02-14 19:14:02 DEBUG classifiers.name :: Based on the name of Field 12 'ID', we're giving it a special type of :type/PK.
02-14 19:14:02 DEBUG classifiers.category :: Field 14 'QUANTITY' has 62 distinct values. Since that is less than 300, we're marking it as a category.
02-14 19:14:02 DEBUG analyze.classify :: Based on classification, updating these values of Field 14 'QUANTITY': {:special_type :type/Category}
02-14 19:14:02 INFO sync.analyze :: [*************************?????????????????????????] ?   50% Analyzed Table 2 'PUBLIC.ORDERS'
02-14 19:14:02 DEBUG analyze.table-row-count :: Set table row count for Table 3 'PUBLIC.PEOPLE' to 2500
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 18 'LATITUDE'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 19 'BIRTH_DATE'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 20 'NAME'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 21 'ID'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 22 'ADDRESS'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 23 'LONGITUDE'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 24 'SOURCE'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 25 'EMAIL'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 26 'CREATED_AT'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 27 'ZIP'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 28 'STATE'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 29 'PASSWORD'
02-14 19:14:03 DEBUG analyze.fingerprint :: Saving fingerprint for Field 30 'CITY'
02-14 19:14:03 DEBUG classifiers.name :: Based on the name of Field 18 'LATITUDE', we're giving it a special type of :type/Latitude.
02-14 19:14:03 DEBUG analyze.classify :: Based on classification, updating these values of Field 18 'LATITUDE': {:special_type :type/Latitude}
02-14 19:14:03 DEBUG classifiers.name :: Based on the name of Field 20 'NAME', we're giving it a special type of :type/Name.
02-14 19:14:03 DEBUG analyze.classify :: Based on classification, updating these values of Field 20 'NAME': {:special_type :type/Name}
02-14 19:14:03 DEBUG classifiers.name :: Based on the name of Field 21 'ID', we're giving it a special type of :type/PK.
02-14 19:14:03 DEBUG classifiers.name :: Based on the name of Field 23 'LONGITUDE', we're giving it a special type of :type/Longitude.
02-14 19:14:03 DEBUG analyze.classify :: Based on classification, updating these values of Field 23 'LONGITUDE': {:special_type :type/Longitude}
02-14 19:14:03 DEBUG classifiers.category :: Field 24 'SOURCE' has 5 distinct values. Since that is less than 300, we're marking it as a category.
02-14 19:14:03 DEBUG analyze.classify :: Based on classification, updating these values of Field 24 'SOURCE': {:special_type :type/Category}
02-14 19:14:03 DEBUG classifiers.text-fingerprint :: Based on the fingerprint of Field 25 'EMAIL', we're marking it as :type/Email.
02-14 19:14:03 DEBUG analyze.classify :: Based on classification, updating these values of Field 25 'EMAIL': {:special_type :type/Email}
02-14 19:14:03 DEBUG classifiers.name :: Based on the name of Field 28 'STATE', we're giving it a special type of :type/State.
02-14 19:14:03 DEBUG analyze.classify :: Based on classification, updating these values of Field 28 'STATE': {:special_type :type/State}
02-14 19:14:03 DEBUG classifiers.name :: Based on the name of Field 30 'CITY', we're giving it a special type of :type/City.
02-14 19:14:03 DEBUG analyze.classify :: Based on classification, updating these values of Field 30 'CITY': {:special_type :type/City}
02-14 19:14:03 INFO sync.analyze :: [*************************************?????????????] ?   75% Analyzed Table 3 'PUBLIC.PEOPLE'
02-14 19:14:03 DEBUG analyze.table-row-count :: Set table row count for Table 4 'PUBLIC.REVIEWS' to 984
02-14 19:14:04 DEBUG analyze.fingerprint :: Saving fingerprint for Field 31 'PRODUCT_ID'
02-14 19:14:04 DEBUG analyze.fingerprint :: Saving fingerprint for Field 32 'ID'
02-14 19:14:04 DEBUG analyze.fingerprint :: Saving fingerprint for Field 33 'BODY'
02-14 19:14:04 DEBUG analyze.fingerprint :: Saving fingerprint for Field 34 'REVIEWER'
02-14 19:14:04 DEBUG analyze.fingerprint :: Saving fingerprint for Field 35 'CREATED_AT'
02-14 19:14:04 DEBUG analyze.fingerprint :: Saving fingerprint for Field 36 'RATING'
02-14 19:14:04 DEBUG classifiers.name :: Based on the name of Field 32 'ID', we're giving it a special type of :type/PK.
02-14 19:14:04 DEBUG analyze.classify :: Based on classification, updating these values of Field 33 'BODY': {:preview_display false}
02-14 19:14:04 DEBUG classifiers.name :: Based on the name of Field 36 'RATING', we're giving it a special type of :type/Category.
02-14 19:14:04 DEBUG analyze.classify :: Based on classification, updating these values of Field 36 'RATING': {:special_type :type/Category}
02-14 19:14:04 INFO sync.analyze :: [**************************************************] ?  100% Analyzed Table 4 'PUBLIC.REVIEWS'
02-14 19:14:04 INFO sync.util :: FINISHED: Analyze data for h2 Database 1 'Sample Dataset' (3 s)
02-14 19:14:04 DEBUG sync.util :: Sync operations in flight: {:sync #{1}, :cache-field-values #{1}}
02-14 19:14:04 INFO sync.util :: STARTING: Cache field values in h2 Database 1 'Sample Dataset'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 1 'EAN'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 2 'RATING'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 3 'PRICE'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 5 'TITLE'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 6 'CATEGORY'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 8 'VENDOR'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 14 'QUANTITY'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 20 'NAME'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 24 'SOURCE'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 28 'STATE'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 30 'CITY'
02-14 19:14:04 DEBUG sync.field-values :: Looking into updating FieldValues for Field 36 'RATING'
02-14 19:14:04 INFO sync.util :: FINISHED: Cache field values in h2 Database 1 'Sample Dataset' (466 ms)
02-14 19:14:04 INFO sync.util :: FINISHED: Sync h2 Database 1 'Sample Dataset' (4 s)
02-14 19:14:04 INFO metabase.core :: Metabase Initialization COMPLETE

Shuts down cleanly - and seems to be able to start again. Is your Metabase DB or something else broken?

^C02-14 19:21:12 INFO metabase.core :: Metabase Shutting Down ...
02-14 19:21:12 INFO metabase.core :: Metabase Shutdown COMPLETE
[root@ae150a58271e ~]# java -jar metabase.jar
02-14 19:21:20 INFO metabase.util :: Loading Metabase...
02-14 19:21:25 INFO util.encryption :: DB details encryption is DISABLED for this Metabase instance. ?
See http://www.metabase.com/docs/latest/operations-guide/start.html#encrypting-your-database-connection-details-at-rest for more information.
02-14 19:21:31 INFO metabase.core :: Starting Metabase in STANDALONE mode
02-14 19:21:31 INFO metabase.core :: Launching Embedded Jetty Webserver with config:
 {:port 3000}

02-14 19:21:31 INFO metabase.core :: Starting Metabase version v0.28.1 (fe0c411 release-0.28.1) ...
02-14 19:21:31 INFO metabase.core :: System timezone is 'UTC' ...
02-14 19:21:33 INFO metabase.core :: Setting up and migrating Metabase DB. Please sit tight, this may take a minute...
02-14 19:21:33 INFO metabase.db :: Verifying h2 Database Connection ...
02-14 19:21:33 INFO metabase.db :: Verify Database Connection ...  ?
02-14 19:21:33 INFO metabase.db :: Running Database Migrations...
02-14 19:21:33 INFO metabase.db :: Setting up Liquibase...
02-14 19:21:33 INFO metabase.db :: Liquibase is ready.
02-14 19:21:33 INFO metabase.db :: Checking if Database has unrun migrations...
02-14 19:21:36 INFO metabase.db :: Database Migrations Current ...  ?
com.mchange.v2.cfg.DelayedLogItem [ level -> FINE, text -> "The configuration file for resource identifier 'hocon:/reference,/application,/c3p0,/' could not be found. Skipping.", exception -> null]
02-14 19:21:36 INFO db.migrations :: Running all necessary data migrations, this may take a minute.
02-14 19:21:37 INFO db.migrations :: Finished running data migrations.
02-14 19:21:37 INFO metabase.events :: Starting events listener: metabase.events.activity-feed ?
02-14 19:21:37 INFO metabase.events :: Starting events listener: metabase.events.dependencies ?
02-14 19:21:37 INFO metabase.events :: Starting events listener: metabase.events.driver-notifications ?
02-14 19:21:37 INFO metabase.events :: Starting events listener: metabase.events.last-login ?
02-14 19:21:37 INFO metabase.events :: Starting events listener: metabase.events.metabot-lifecycle ?
02-14 19:21:37 INFO metabase.events :: Starting events listener: metabase.events.notifications ?
02-14 19:21:37 INFO metabase.events :: Starting events listener: metabase.events.revision ?
02-14 19:21:37 INFO metabase.events :: Starting events listener: metabase.events.sync-database ?
02-14 19:21:37 INFO metabase.events :: Starting events listener: metabase.events.view-log ?
02-14 19:21:37 INFO metabase.task :: Loading tasks namespace: metabase.task.cleanup-temporary-computation-job-results ?
02-14 19:21:37 INFO metabase.task :: Loading tasks namespace: metabase.task.follow-up-emails ?
02-14 19:21:37 INFO metabase.task :: Loading tasks namespace: metabase.task.send-anonymous-stats ?
02-14 19:21:37 INFO metabase.task :: Loading tasks namespace: metabase.task.send-pulses ?
02-14 19:21:37 INFO metabase.task :: Loading tasks namespace: metabase.task.sync-databases ?
02-14 19:21:37 INFO metabase.task :: Loading tasks namespace: metabase.task.upgrade-checks ?
02-14 19:21:37 INFO metabase.core :: Looks like this is a new installation ... preparing setup wizard
02-14 19:21:37 INFO metabase.core :: Please use the following url to setup your Metabase installation:

http://localhost:3000/setup/


02-14 19:21:37 INFO metabase.core :: Metabase Initialization COMPLETE
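
(And if you suspect your Metabase application database itself got into a bad state: with the default embedded H2 setup it's just the metabase.db.* files created next to the jar - unless you've pointed MB_DB_FILE somewhere else - so moving them aside before relaunching should give you a completely fresh start, at the cost of any saved setup. A sketch:)

mkdir h2-backup && mv metabase.db.* h2-backup/
java -jar metabase.jar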

Hello. Yes, I agree with Andrew ("You have to help others to help you."), so I'm trying:

1. Enabled Hyper-V (Microsoft Server 2008 R2)
2. Installed CentOS 7 (on Hyper-V)
3. yum -y update
4. Installed Java
5. Downloaded Metabase v0.28 (jar)
6. Fresh install with "java -jar metabase.jar" - see the command sketch below

and got the errors shown in the log above.
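
In terms of commands, that was roughly the following (a sketch - the Java package name is my best guess, any Java 8 runtime should behave the same):

yum -y update
yum -y install java-1.8.0-openjdk wget
wget http://downloads.metabase.com/v0.28.1/metabase.jar
java -jar metabase.jar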


Very good input @FATiH :+1: we are getting very much on the same page.

Seems both you and I used CentOS 7 then. As my dump above shows, I chose a Docker image with OpenJDK 8 - just because JDK 8 seemed like the most common option. Could that be a significant difference between our runs?

edit: By the way, I was running Docker on Windows 10 Enterprise. Can you somehow reproduce my run with the same Docker image? (I don't think it should matter where you run Docker.)

Anything else you can think of to narrow down differences?


Hah! I just googled your Liquibase error: https://www.google.com/search?q=liquibase.exception.DatabaseException:+Unknown+data+type:+“İNT”

The results point at the Turkish locale - which seems to match the timezone in your run. I'd say try switching your locale …
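
A quick way to test that theory without touching the whole system (just a sketch - -Duser.language/-Duser.country are standard JVM properties, but I haven't tried this against Metabase myself):

# see what the box is currently set to
locale

# force the JVM into an English locale for this run only
java -Duser.language=en -Duser.country=US -jar metabase.jar

# or override the locale via the environment for this session
LC_ALL=en_US.UTF-8 java -jar metabase.jar

If that makes the İNT error go away, CentOS 7 lets you change it permanently with localectl set-locale LANG=en_US.UTF-8.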


I literally had to zoom in to spot the error in your log above!

Note the different letter in "İNT" vs. "INT" - that dotted capital İ. The craziest error I've ever come across.
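
For anyone wondering how the letter changes at all: Java's String.toUpperCase() is locale-sensitive, and under the Turkish locale a lowercase "i" uppercases to the dotted capital "İ". A minimal sketch of just that JDK behaviour (not Metabase or Liquibase code):

import java.util.Locale;

public class TurkishI {
    public static void main(String[] args) {
        Locale turkish = new Locale("tr", "TR");
        // Locale-sensitive upper-casing: "int" becomes "İNT" under tr_TR,
        // which is exactly the token H2 rejects as an unknown data type.
        System.out.println("int".toUpperCase(turkish));     // İNT
        // Locale-neutral upper-casing keeps the expected SQL keyword.
        System.out.println("int".toUpperCase(Locale.ROOT)); // INT
    }
}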


Time I bought a new monitor - I thought I had a duff pixel! Nice catch.


Just lolled helplessly after that comment @AndrewMBaines

FYI of course the internet already knew …

And there's already a bug filed against Liquibase: https://liquibase.jira.com/plugins/servlet/mobile#issue/CORE-2772 which in the end points to a whole GitHub repo: “See https://github.com/aliok/liquibase-turkish-bug for the instructions about how to reproduce.
I tried with Liquibase core 3.3.2 and 3.5.1(latest). It happens on both versions.”


Thank you for your interest. I will try the solutions and share my experience with you.
It seems to be related to the system timezone/locale.
