[error]     at com.mongodb.connection.CommandProtocol.execute(CommandProtocol.java:)
[error] at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:)
[error] at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:)
[error] at com.mongodb.connection.DefaultServerConnection.command(DefaultServerConnection.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.DropDatabaseOperation$.call(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.DropDatabaseOperation$.call(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:)
[error] at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:)
[error] at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:)
[error] at com.mongodb.Mongo.execute(Mongo.java:)
[error] at com.mongodb.Mongo$.execute(Mongo.java:)
[error] at com.mongodb.MongoDatabaseImpl.drop(MongoDatabaseImpl.java:)
[error] at com.mongodb.spark.MongoDBDefaults.dropDB(MongoDBDefaults.scala:)
[error] at com.mongodb.spark.JavaRequiresMongoDB.tearDown(JavaRequiresMongoDB.java:)
[error] ...
[info] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromTheSparkConf started
[error] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromTheSparkConf failed: com.mongodb.MongoCommandException: Command failed with error : 'not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }' on server localhost:. The full response is { "ok" : 0.0, "errmsg" : "not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }", "code" : , "codeName" : "Unauthorized" }, took 0.001 sec
[error] at com.mongodb.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:)
[error] at com.mongodb.connection.CommandProtocol.execute(CommandProtocol.java:)
[error] at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:)
[error] at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:)
[error] at com.mongodb.connection.DefaultServerConnection.command(DefaultServerConnection.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.DropDatabaseOperation$.call(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.DropDatabaseOperation$.call(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:)
[error] at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:)
[error] at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:)
[error] at com.mongodb.Mongo.execute(Mongo.java:)
[error] at com.mongodb.Mongo$.execute(Mongo.java:)
[error] at com.mongodb.MongoDatabaseImpl.drop(MongoDatabaseImpl.java:)
[error] at com.mongodb.spark.MongoDBDefaults.dropDB(MongoDBDefaults.scala:)
[error] at com.mongodb.spark.JavaRequiresMongoDB.setUp(JavaRequiresMongoDB.java:)
[error] ...
[error] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromTheSparkConf failed: com.mongodb.MongoCommandException: Command failed with error : 'not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }' on server localhost:. The full response is { "ok" : 0.0, "errmsg" : "not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }", "code" : , "codeName" : "Unauthorized" }, took 0.001 sec
[error] at com.mongodb.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:)
[error] at com.mongodb.connection.CommandProtocol.execute(CommandProtocol.java:)
[error] at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:)
[error] at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:)
[error] at com.mongodb.connection.DefaultServerConnection.command(DefaultServerConnection.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.DropDatabaseOperation$.call(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.DropDatabaseOperation$.call(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:)
[error] at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:)
[error] at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:)
[error] at com.mongodb.Mongo.execute(Mongo.java:)
[error] at com.mongodb.Mongo$.execute(Mongo.java:)
[error] at com.mongodb.MongoDatabaseImpl.drop(MongoDatabaseImpl.java:)
[error] at com.mongodb.spark.MongoDBDefaults.dropDB(MongoDBDefaults.scala:)
[error] at com.mongodb.spark.JavaRequiresMongoDB.tearDown(JavaRequiresMongoDB.java:)
[error] ...
[info] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromAJavaMapAndUseDefaults started
[error] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromAJavaMapAndUseDefaults failed: com.mongodb.MongoCommandException: Command failed with error : 'not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }' on server localhost:. The full response is { "ok" : 0.0, "errmsg" : "not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }", "code" : , "codeName" : "Unauthorized" }, took 0.001 sec
[error] at com.mongodb.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:)
[error] at com.mongodb.connection.CommandProtocol.execute(CommandProtocol.java:)
[error] at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:)
[error] at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:)
[error] at com.mongodb.connection.DefaultServerConnection.command(DefaultServerConnection.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.DropDatabaseOperation$.call(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.DropDatabaseOperation$.call(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:)
[error] at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:)
[error] at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:)
[error] at com.mongodb.Mongo.execute(Mongo.java:)
[error] at com.mongodb.Mongo$.execute(Mongo.java:)
[error] at com.mongodb.MongoDatabaseImpl.drop(MongoDatabaseImpl.java:)
[error] at com.mongodb.spark.MongoDBDefaults.dropDB(MongoDBDefaults.scala:)
[error] at com.mongodb.spark.JavaRequiresMongoDB.setUp(JavaRequiresMongoDB.java:)
[error] ...
[error] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromAJavaMapAndUseDefaults failed: com.mongodb.MongoCommandException: Command failed with error : 'not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }' on server localhost:. The full response is { "ok" : 0.0, "errmsg" : "not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }", "code" : , "codeName" : "Unauthorized" }, took 0.001 sec
[error] at com.mongodb.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:)
[error] at com.mongodb.connection.CommandProtocol.execute(CommandProtocol.java:)
[error] at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:)
[error] at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:)
[error] at com.mongodb.connection.DefaultServerConnection.command(DefaultServerConnection.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[error] at com.mongodb.operation.DropDatabaseOperation$.call(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.DropDatabaseOperation$.call(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:)
[error] at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:)
[error] at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:)
[error] at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:)
[error] at com.mongodb.Mongo.execute(Mongo.java:)
[error] at com.mongodb.Mongo$.execute(Mongo.java:)
[error] at com.mongodb.MongoDatabaseImpl.drop(MongoDatabaseImpl.java:)
[error] at com.mongodb.spark.MongoDBDefaults.dropDB(MongoDBDefaults.scala:)
[error] at com.mongodb.spark.JavaRequiresMongoDB.tearDown(JavaRequiresMongoDB.java:)
[error] ...
[info] Test run finished: failed, ignored, total, .003s
[info] HelpersSpec:
[info] Exception encountered when attempting to run a suite with class name: UDF.HelpersSpec *** ABORTED ***
[info] com.mongodb.MongoCommandException: Command failed with error : 'not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }' on server localhost:. The full response is { "ok" : 0.0, "errmsg" : "not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }", "code" : , "codeName" : "Unauthorized" }
[info] at com.mongodb.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:)
[info] at com.mongodb.connection.CommandProtocol.execute(CommandProtocol.java:)
[info] at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:)
[info] at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:)
[info] at com.mongodb.connection.DefaultServerConnection.command(DefaultServerConnection.java:)
[info] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[info] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[info] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[info] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:)
[info] at com.mongodb.operation.DropDatabaseOperation$.call(DropDatabaseOperation.java:)
[info] ...
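Every failure up to this point is the same MongoCommandException: the test harness calls MongoDatabaseImpl.drop() from setUp and tearDown, and the connection it opens is not authorized to run dropDatabase on mongo-spark-connector-test, so the mongod evidently has authorization enabled. The account the tests authenticate as needs a role covering the dropDatabase action on that database. A minimal sketch with the Java driver follows; the admin credentials and the sparkTest user name and password are placeholders rather than values taken from this run, and the same createUser command can equally be issued from the mongo shell.

    import com.mongodb.{MongoClient, MongoClientURI}
    import org.bson.Document

    // Placeholder admin credentials; substitute whatever administrative account
    // actually exists on this deployment.
    val adminClient = new MongoClient(
      new MongoClientURI("mongodb://admin:adminPassword@localhost:27017/?authSource=admin"))

    // dbOwner on the test database includes the dropDatabase action that
    // MongoDatabaseImpl.drop() needs.
    val createUser = new Document("createUser", "sparkTest")
      .append("pwd", "sparkTestPassword")
      .append("roles", java.util.Arrays.asList(
        new Document("role", "dbOwner").append("db", "mongo-spark-connector-test")))

    adminClient.getDatabase("admin").runCommand(createUser)
    adminClient.close()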
// :: INFO AuthConnectionSpec: Ended Test: 'AuthConnectionSpec'
[info] AuthConnectionSpec:
[info] MongoRDD
[info] - should be able to connect to an authenticated db *** FAILED ***
[info] org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
[info] org.apache.spark.SparkContext.<init>(SparkContext.scala:)
[info] com.mongodb.spark.SparkConfOverrideSpec.<init>(SparkConfOverrideSpec.scala:)
[info] sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
[info] sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:)
[info] sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:)
[info] java.lang.reflect.Constructor.newInstance(Constructor.java:)
[info] java.lang.Class.newInstance(Class.java:)
[info] org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:)
[info] sbt.TestRunner.runTest$(TestFramework.scala:)
[info] sbt.TestRunner.run(TestFramework.scala:)
[info] sbt.TestFramework$$anon$$$anonfun$$init$$$$anonfun$apply$.apply(TestFramework.scala:)
[info] sbt.TestFramework$$anon$$$anonfun$$init$$$$anonfun$apply$.apply(TestFramework.scala:)
[info] sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:)
[info] sbt.TestFramework$$anon$$$anonfun$$init$$.apply(TestFramework.scala:)
[info] sbt.TestFramework$$anon$$$anonfun$$init$$.apply(TestFramework.scala:)
[info] sbt.TestFunction.apply(TestFramework.scala:)
[info] sbt.Tests$.sbt$Tests$$processRunnable$(Tests.scala:)
[info] sbt.Tests$$anonfun$makeSerial$.apply(Tests.scala:)
[info] sbt.Tests$$anonfun$makeSerial$.apply(Tests.scala:)
[info] sbt.std.Transform$$anon$$$anonfun$apply$.apply(System.scala:)
[info] at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$.apply(SparkContext.scala:)
[info] at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$.apply(SparkContext.scala:)
[info] at scala.Option.foreach(Option.scala:)
[info] at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:)
[info] at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:)
[info] at org.apache.spark.SparkContext.<init>(SparkContext.scala:)
[info] at com.mongodb.spark.AuthConnectionSpec$$anonfun$.apply$mcV$sp(AuthConnectionSpec.scala:)
[info] at com.mongodb.spark.AuthConnectionSpec$$anonfun$.apply(AuthConnectionSpec.scala:)
[info] at com.mongodb.spark.AuthConnectionSpec$$anonfun$.apply(AuthConnectionSpec.scala:)
[info] at org.scalatest.Transformer$$anonfun$apply$.apply$mcV$sp(Transformer.scala:)
[info] ...
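This AuthConnectionSpec failure is not a MongoDB problem at all: a SparkContext created earlier by SparkConfOverrideSpec is still alive in the JVM, and Spark refuses to build a second one. The message itself names the spark.driver.allowMultipleContexts escape hatch; the gentler fix is to reuse the active context. A sketch, with the master and app name chosen purely for illustration:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("AuthConnectionSpec")
      // The workaround named in the error message, normally only sensible in tests:
      // .set("spark.driver.allowMultipleContexts", "true")

    // getOrCreate hands back the SparkContext that SparkConfOverrideSpec already
    // started instead of constructing a second one and aborting.
    val sc = SparkContext.getOrCreate(conf)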
[info] Reading scoverage instrumentation [/usr/local/mongo-spark/target/scala-2.11/scoverage-data/scoverage.coverage.xml]
[info] Reading scoverage measurements...
[info] Generating scoverage reports...
[info] Written Cobertura report [/usr/local/mongo-spark/target/scala-2.11/coverage-report/cobertura.xml]
[info] Written XML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/scoverage.xml]
[info] Written HTML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/index.html]
[info] Coverage reports completed
[info] All done. Coverage was [24.73%]
[info] ScalaCheck
[info] Passed: Total , Failed , Errors , Passed
[info] ScalaTest
[info] Run completed in seconds, milliseconds.
[info] Total number of tests run:
[info] Suites: completed , aborted
[info] Tests: succeeded , failed , canceled , ignored , pending
[info] *** SUITES ABORTED ***
[info] *** TEST FAILED ***
[error] Error: Total , Failed , Errors , Passed
[error] Failed tests:
[error] com.mongodb.spark.AuthConnectionSpec
[error] com.mongodb.spark.sql.MongoDataFrameReaderTest
[error] com.mongodb.spark.config.ReadConfigTest
[error] com.mongodb.spark.MongoSparkTest
[error] com.mongodb.spark.sql.MongoDataFrameWriterTest
[error] com.mongodb.spark.config.WriteConfigTest
[error] com.mongodb.spark.sql.MongoDataFrameTest
[error] com.mongodb.spark.NoSparkConfTest
[error] com.mongodb.spark.MongoConnectorTest
[error] com.mongodb.spark.sql.fieldTypes.api.java.FieldTypesTest
[error] Error during tests:
[error] com.mongodb.spark.rdd.partitioner.MongoSplitVectorPartitionerSpec
[error] com.mongodb.spark.connection.DefaultMongoClientFactorySpec
[error] com.mongodb.spark.rdd.partitioner.MongoShardedPartitionerSpec
[error] com.mongodb.spark.rdd.partitioner.PartitionerHelperSpec
[error] com.mongodb.spark.rdd.partitioner.MongoSamplePartitionerSpec
[error] com.mongodb.spark.MongoConnectorSpec
[error] com.mongodb.spark.sql.MongoInferSchemaSpec
[error] com.mongodb.spark.rdd.partitioner.MongoPaginateBySizePartitionerSpec
[error] com.mongodb.spark.sql.MapFunctionsSpec
[error] com.mongodb.spark.sql.MongoRelationHelperSpec
[error] UDF.HelpersSpec
[error] com.mongodb.spark.connection.MongoClientRefCounterSpec
[error] com.mongodb.spark.sql.MongoDataFrameSpec
[error] com.mongodb.spark.MongoRDDSpec
[error] com.mongodb.spark.NoSparkConfSpec
[error] com.mongodb.spark.sql.fieldTypes.FieldTypesSpec
[error] com.mongodb.spark.connection.MongoClientCacheSpec
[error] com.mongodb.spark.rdd.partitioner.MongoPaginateByCountPartitionerSpec
[error] com.mongodb.spark.SparkConfOverrideSpec
[error] (mongo-spark-connector/test:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: s, completed -- ::
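Before re-running, the connector itself also needs credentials, otherwise every suite that touches MongoDB keeps failing with the same Unauthorized error. The connector takes its connection strings from spark.mongodb.input.uri and spark.mongodb.output.uri; a sketch of wiring in an authenticated URI is below, where the user, password, authSource and the .coll collection name are placeholders for whatever this deployment actually uses:

    import com.mongodb.spark.MongoSpark
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("mongo-spark-auth")
      .config("spark.mongodb.input.uri",
        "mongodb://sparkTest:sparkTestPassword@localhost:27017/mongo-spark-connector-test.coll?authSource=admin")
      .config("spark.mongodb.output.uri",
        "mongodb://sparkTest:sparkTestPassword@localhost:27017/mongo-spark-connector-test.coll?authSource=admin")
      .getOrCreate()

    // Reads through the connector now authenticate as sparkTest.
    val df = MongoSpark.load(spark)
    df.printSchema()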
[root@hadoop1 mongo-spark]# ./sbt check
[info] Loading project definition from /usr/local/mongo-spark/project
[info] Set current project to mongo-spark-connector (in build file:/usr/local/mongo-spark/)
[warn] Credentials file /root/.ivy2/.spCredentials does not exist
[success] Total time: s, completed -- ::
[info] scalastyle using config /usr/local/mongo-spark/project/scalastyle-config.xml
[info] Processed file(s)
[info] Found errors
[info] Found warnings
[info] Found infos
[info] Finished in ms
[success] created output: /usr/local/mongo-spark/target
[success] Total time: s, completed -- ::
[success] Total time: s, completed -- ::
[warn] Credentials file /root/.ivy2/.spCredentials does not exist
[info] Updating {file:/usr/local/mongo-spark/}mongo-spark-connector...
[info] Resolving org.scala-lang#scala-library;2.11. ...
[info] Formatting Scala sources {file:/usr/local/mongo-spark/}mongo-spark-connector(test) ...
[info] Resolving org.apache#apache; ...
[error] Server access Error: Connection timed out url=https://repo1.maven.org/maven2/org/apache/spark/spark-parent_2.11/2.2.0/spark-parent_2.11-2.2.0.jar
[info] Resolving jline#jline;2.12. ...
[info] Done updating.
[info] Formatting Scala sources {file:/usr/local/mongo-spark/}mongo-spark-connector(compile) ...
[info] Compiling Scala sources and Java sources to /usr/local/mongo-spark/target/scala-2.11/classes...
[info] [info] Cleaning datadir [/usr/local/mongo-spark/target/scala-2.11/scoverage-data]
[info] [info] Beginning coverage instrumentation
[info] [info] Instrumentation completed [ statements]
[info] [info] Wrote instrumentation file [/usr/local/mongo-spark/target/scala-2.11/scoverage-data/scoverage.coverage.xml]
[info] [info] Will write measurement data to [/usr/local/mongo-spark/target/scala-2.11/scoverage-data]
[info] Compiling Scala sources and Java sources to /usr/local/mongo-spark/target/scala-2.11/test-classes...
// :: WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
// :: WARN Utils: Your hostname, hadoop1 resolves to a loopback address: 127.0.0.1; using 192.168.2.51 instead (on interface eno1)
// :: WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
// :: INFO log: Logging initialized @188125ms
// :: INFO MongoClientCache: Creating MongoClient: [localhost:]
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, age, name, filters:
[info] Test run started
[info] Test com.mongodb.spark.NoSparkConfTest.shouldBeAbleToUseConfigsWithRDDs started
[info] Test com.mongodb.spark.NoSparkConfTest.shouldBeAbleToUseConfigsWithDataFrames started
[info] Test run finished: failed, ignored, total, .87s
// :: INFO MongoPaginateByCountPartitionerSpec: Running Test: 'MongoPaginateByCountPartitionerSpec'
// :: INFO MongoPaginateByCountPartitionerSpec: Running Test: 'MongoPaginateByCountPartitionerSpec'
// :: INFO MongoPaginateByCountPartitionerSpec: Running Test: 'MongoPaginateByCountPartitionerSpec'
// :: INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoPaginateByCountPartitionerSpec'
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), GreaterThanOrEqual(_id,), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), GreaterThanOrEqual(_id,), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), GreaterThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), GreaterThan(_id,)
// :: INFO MongoPaginateByCountPartitionerSpec: Running Test: 'MongoPaginateByCountPartitionerSpec'
// :: WARN MongoPaginateByCountPartitioner: Inefficient partitioning, creating a partition per document. Decrease the `numberOfPartitions` property.
// :: INFO MongoPaginateByCountPartitioner: Empty collection (MongoPaginateByCountPartitionerSpec), using a single partition
// :: INFO MongoPaginateByCountPartitioner: Empty collection (MongoPaginateByCountPartitionerSpec), using a single partition
// :: INFO MongoPaginateByCountPartitionerSpec: Ended Test: 'MongoPaginateByCountPartitionerSpec'
[info] MongoPaginateByCountPartitionerSpec:
[info] MongoPaginateByCountPartitioner
[info] - should partition the database as expected
[info] - should partition on an alternative shardkey as expected
[info] - should use the users pipeline when set in a rdd / dataframe
[info] - should handle fewer documents than partitions
[info] - should handle no collection
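The "Inefficient partitioning, creating a partition per document" warning above is MongoPaginateByCountPartitioner noticing that numberOfPartitions exceeds the number of documents. Outside the test suite that knob is exposed through the partitioner options; the property names below are the ones documented for the 2.x connector, and the value 10 is only illustrative:

    import com.mongodb.spark.MongoSpark
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("paginate-by-count")
      .config("spark.mongodb.input.uri",
        "mongodb://localhost:27017/mongo-spark-connector-test.coll")
      .config("spark.mongodb.input.partitioner", "MongoPaginateByCountPartitioner")
      // Keep this at or below the expected document count to avoid per-document partitions.
      .config("spark.mongodb.input.partitionerOptions.numberOfPartitions", "10")
      .getOrCreate()

    println(MongoSpark.load(spark.sparkContext).getNumPartitions)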
[info] WriteConfigSpec:
[info] WriteConfig
[info] - should have the expected defaults
[info] - should be creatable from SparkConfig
[info] - should round trip options
[info] - should validate the values
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, age, name, filters:
// :: INFO SparkConfOverrideSpec: Ended Test: 'be able to able to override partial configs with options'
[info] SparkConfOverrideSpec:
[info] MongoRDD
[info] - should be able to override partial configs with Read / Write Configs
[info] DataFrame Readers and Writers
[info] - should be able to able to override partial configs with options
[info] Test run started
[info] Test com.mongodb.spark.sql.MongoDataFrameWriterTest.shouldBeEasilyCreatedFromADataFrameAndSaveToMongo started
[info] Test com.mongodb.spark.sql.MongoDataFrameWriterTest.shouldTakeACustomOptions started
[info] Test com.mongodb.spark.sql.MongoDataFrameWriterTest.shouldBeEasilyCreatedFromMongoSpark started
[info] Test run finished: failed, ignored, total, .658s
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, arrayInt, binary, boolean, code, codeWithScope, date, dbPointer, document, double, int32, int64, maxKey, minKey, null, objectId, oldBinary, regex, string, symbol, timestamp, undefined, filters:
[info] Test run started
[info] Test com.mongodb.spark.sql.MongoDataFrameTest.shouldRoundTripAllBsonTypes started
[info] Test run finished: failed, ignored, total, .4s
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,)
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: name, age, filters:
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: a, filters:
// :: INFO MongoRelation: requiredColumns: a, filters:
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: a, filters:
// :: INFO MongoRelation: requiredColumns: a, filters:
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: _id, a, filters:
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: _id, a, filters:
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: , filters: IsNotNull(age)
// :: INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,)
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,)
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,)
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: , filters: IsNotNull(age)
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: , filters: IsNotNull(age)
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: _id, arrayInt, binary, bool, code, codeWithScope, date, dbPointer, dbl, decimal, document, int32, int64, maxKey, minKey, nullValue, objectId, oldBinary, regex, string, symbol, timestamp, undefined, filters:
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: nullValue, int32, int64, bool, date, dbl, decimal, string, minKey, maxKey, objectId, code, codeWithScope, regex, symbol, timestamp, undefined, binary, oldBinary, arrayInt, document, dbPointer, filters:
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: name, attributes, filters: IsNotNull(name)
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, count, filters: IsNotNull(_id), IsNotNull(count)
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, count, filters: IsNotNull(_id), IsNotNull(count)
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: _id, name, age, filters:
// :: INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
// :: INFO MongoRelation: requiredColumns: _id, name, filters:
// :: INFO MongoRelation: requiredColumns: _id, name, age, filters:
// :: INFO MongoDataFrameSpec: Ended Test: 'MongoDataFrameSpec'
[info] MongoDataFrameSpec:
[info] DataFrameReader
[info] - should should be easily created from the SQLContext and load from Mongo
[info] - should handle selecting out of order columns
[info] - should handle mixed numerics with long precedence
[info] - should handle mixed numerics with double precedence
[info] - should handle array fields with null values
[info] - should handle document fields with null values
[info] - should be easily created with a provided case class
[info] - should include any pipelines when inferring the schema
[info] - should use any pipelines when set via the MongoRDD
[info] - should throw an exception if pipeline is invalid
[info] DataFrameWriter
[info] - should be easily created from a DataFrame and save to Mongo
[info] - should take custom writeConfig
[info] - should support INSERT INTO SELECT statements
[info] - should support INSERT OVERWRITE SELECT statements
[info] DataFrames
[info] - should round trip all bson types
[info] - should be able to cast all types to a string value
[info] - should be able to round trip schemas containing MapTypes
[info] - should be able to upsert and replace data in an existing collection
[info] - should be able to handle optional _id fields when upserting / replacing data in a collection
[info] - should be able to set only the data in the Dataset to the collection
// :: INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoSamplePartitionerSpec'
// :: INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoSamplePartitionerSpec'
// :: INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoSamplePartitionerSpec'
// :: INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoSamplePartitionerSpec'
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), GreaterThanOrEqual(_id,)
// :: INFO MongoSamplePartitioner: Could not find collection (MongoSamplePartitionerSpec), using a single partition
// :: INFO MongoSamplePartitionerSpec: Ended Test: 'MongoSamplePartitionerSpec'
[info] MongoSamplePartitionerSpec:
[info] MongoSamplePartitioner
[info] - should partition the database as expected
[info] - should partition on an alternative shardkey as expected
[info] - should partition with a composite key
[info] - should use the users pipeline when set in a rdd / dataframe
[info] - should have a default bounds of min to max key
[info] - should handle no collection
[info] - should handle an empty collection
[info] ReadConfigSpec:
[info] ReadConfig
[info] - should have the expected defaults
[info] - should be creatable from SparkConfig
[info] - should use the URI for default values
[info] - should override URI values with named values
[info] - should round trip options
[info] - should be able to create a map
[info] - should create the expected ReadPreference and ReadConcern
[info] - should validate the values
// :: INFO MongoClientRefCounterSpec: Ended Test: 'MongoClientRefCounterSpec'
[info] MongoClientRefCounterSpec:
[info] MongoClientRefCounter
[info] - should count references as expected
[info] - should be able to acquire multiple times
[info] - should throw an exception for invalid releases of a MongoClient
// :: INFO MongoDBDefaults: Loading sample Data: ~5MB data into 'MongoSplitVectorPartitionerSpec'
// :: INFO MongoSplitVectorPartitioner: No splitKeys were calculated by the splitVector command, proceeding with a single partition. If this is undesirable try lowering 'partitionSizeMB' property to produce more partitions.
// :: INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoSplitVectorPartitionerSpec'
// :: INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoSplitVectorPartitionerSpec'
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), GreaterThanOrEqual(_id,), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), GreaterThanOrEqual(_id,), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), GreaterThanOrEqual(_id,)
// :: INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), GreaterThanOrEqual(_id,)
// :: INFO MongoSplitVectorPartitioner: No splitKeys were calculated by the splitVector command, proceeding with a single partition. If this is undesirable try lowering 'partitionSizeMB' property to produce more partitions.
// :: INFO MongoSplitVectorPartitioner: Could not find collection (MongoSplitVectorPartitionerSpec), using a single partition
// :: INFO MongoSplitVectorPartitioner: No splitKeys were calculated by the splitVector command, proceeding with a single partition. If this is undesirable try lowering 'partitionSizeMB' property to produce more partitions.
// :: INFO MongoSplitVectorPartitionerSpec: Ended Test: 'MongoSplitVectorPartitionerSpec'
[info] MongoSplitVectorPartitionerSpec:
[info] MongoSplitVectorPartitioner
[info] - should partition the database as expected
[info] - should use the provided pipeline for min and max keys
[info] - should use the users pipeline when set in a rdd / dataframe
[info] - should have a default bounds of min to max key
[info] - should handle no collection
[info] - should handle an empty collection
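Likewise, the "No splitKeys were calculated by the splitVector command" messages mean the whole collection fits inside one partitionSizeMB chunk, so MongoSplitVectorPartitioner falls back to a single partition. Lowering partitionSizeMB, exactly as the log suggests, produces more partitions; the 8 MB value here is only an example (the documented default is 64 MB):

    import com.mongodb.spark.MongoSpark
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("split-vector-partitioning")
      .config("spark.mongodb.input.uri",
        "mongodb://localhost:27017/mongo-spark-connector-test.coll")
      .config("spark.mongodb.input.partitioner", "MongoSplitVectorPartitioner")
      // Smaller chunk size => more splitKeys => more partitions.
      .config("spark.mongodb.input.partitionerOptions.partitionSizeMB", "8")
      .getOrCreate()

    println(MongoSpark.load(spark.sparkContext).getNumPartitions)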
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, binary, dbPointer, javaScript, javaScriptWithScope, maxKey, minKey, regularExpression, symbol, timestamp, undefined, filters:
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, binary, dbPointer, javaScript, javaScriptWithScope, maxKey, minKey, regularExpression, symbol, timestamp, undefined, filters:
[info] Test run started
[info] Test com.mongodb.spark.sql.fieldTypes.api.java.FieldTypesTest.shouldBeAbleToCreateADatasetBasedOnAJavaBeanRepresentingComplexBsonTypes started
[info] Test com.mongodb.spark.sql.fieldTypes.api.java.FieldTypesTest.shouldAllowTheRoundTrippingOfAJavaBeanRepresentingComplexBsonTypes started
[info] Test run finished: failed, ignored, total, .387s
// :: INFO MongoClientCache: Creating MongoClient: [localhost:]
// :: INFO MongoClientCache: Creating MongoClient: [localhost:]
// :: INFO MongoConnectorSpec: Ended Test: 'MongoConnectorSpec'
[info] MongoConnectorSpec:
[info] MongoConnector
[info] - should create a MongoClient
[info] - should set the correct localThreshold
[info] - should Use the cache for MongoClients
[info] - should create a MongoClient with a custom MongoConnectionFactory
// :: INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoPaginateBySizePartitionerSpec'
// :: INFO MongoPaginateBySizePartitioner: Inefficient partitioning, creating a single partition. Decrease the `partitionsizemb` property.
// :: INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoPaginateBySizePartitionerSpec'
// :: INFO MongoPaginateBySizePartitionerSpec: Running Test: 'MongoPaginateBySizePartitionerSpec'
// :: INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoPaginateBySizePartitionerSpec'
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), GreaterThanOrEqual(_id,), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), GreaterThanOrEqual(_id,), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), LessThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), GreaterThan(_id,)
// :: INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), GreaterThan(_id,)
// :: INFO MongoPaginateBySizePartitioner: Inefficient partitioning, creating a single partition. Decrease the `partitionsizemb` property.
// :: INFO MongoPaginateBySizePartitioner: Could not find collection (MongoPaginateBySizePartitionerSpec), using a single partition
// :: INFO MongoPaginateBySizePartitioner: Empty collection (MongoPaginateBySizePartitionerSpec), using a single partition
// :: INFO MongoPaginateBySizePartitionerSpec: Ended Test: 'MongoPaginateBySizePartitionerSpec'
[info] MongoPaginateBySizePartitionerSpec:
[info] MongoPaginateBySizePartitioner
[info] - should partition the database as expected
[info] - should partition on an alternative shardkey as expected
[info] - should use the users pipeline when set in a rdd / dataframe
[info] - should have a default bounds of min to max key
[info] - should handle no collection
[info] - should handle an empty collection
// :: INFO MapFunctionsSpec: Ended Test: 'MapFunctionsSpec'
[info] MapFunctionsSpec:
[info] documentToRow
[info] - should convert a Document into a Row with the given schema
[info] - should not prune the schema when given a document with missing values
[info] - should prune the schema when limited by passed required columns
[info] - should ignore any extra data in the document that is not included in the schema
[info] - should handle nested schemas
[info] - should handle schemas containing maps
[info] - should throw an exception when passed maps without string keys
[info] rowToDocument
[info] - should convert a Row into a Document
[info] - should handle nested schemas
[info] - should handle nested schemas within nested arrays
[info] - should handle mixed numerics based on the schema
[info] - should throw a MongoTypeConversionException when casting to an invalid DataType
// :: INFO FieldTypesSpec: Running Test: 'FieldTypesSpec'
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, binary, dbPointer, javaScript, javaScriptWithScope, minKey, maxKey, regularExpression, symbol, timestamp, undefined, filters:
// :: INFO FieldTypesSpec: Running Test: 'FieldTypesSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, binary, dbPointer, javaScript, javaScriptWithScope, minKey, maxKey, regularExpression, symbol, timestamp, undefined, filters:
// :: INFO FieldTypesSpec: Ended Test: 'FieldTypesSpec'
[info] FieldTypesSpec:
[info] Fields
[info] - should allow the round tripping of a case class representing complex bson types
[info] - should be able to create a dataset based on a case class representing complex bson types
[info] - should be able to create a Regular Expression from a Pattern
[info] BsonValueOrderingSpec:
[info] The BsonValueOrdering trait
[info] - should order all bson types correctly
[info] - should compare numbers types correctly
[info] - should compare numbers and longs correctly
[info] - should compare string types correctly
[info] - should compare timestamp types correctly
[info] - should compare binary types correctly
[info] - should compare array types correctly
[info] - should compare regex types correctly
[info] - should compare document types correctly
[info] - should have no defined order for undefined types
// :: INFO MongoClientCache: Creating MongoClient: [localhost:]
// :: INFO MongoClientCache: Closing MongoClient: [localhost:]
// :: INFO MongoClientCache: Creating MongoClient: [localhost:]
// :: INFO MongoClientCache: Closing MongoClient: [localhost:]
// :: INFO MongoClientCache: Creating MongoClient: [localhost:]
// :: INFO MongoClientCache: Closing MongoClient: [localhost:]
// :: INFO MongoClientCache: Creating MongoClient: [localhost:]
// :: INFO MongoClientCache: Closing MongoClient: [localhost:]
// :: WARN MongoClientCache: Release without acquire for key: Mongo{options=MongoClientOptions{description='null', applicationName='null', readPreference=primary, writeConcern=WriteConcern{w=null, wTimeout=null ms, fsync=null, journal=null, readConcern=com.mongodb.ReadConcern@, codecRegistry=org.bson.codecs.configuration.ProvidersCodecRegistry@8cef5306, commandListeners=[], clusterListeners=[], serverListeners=[], serverMonitorListeners=[], minConnectionsPerHost=, maxConnectionsPerHost=, threadsAllowedToBlockForConnectionMultiplier=, serverSelectionTimeout=, maxWaitTime=, maxConnectionIdleTime=, maxConnectionLifeTime=, connectTimeout=, socketTimeout=, socketKeepAlive=false, sslEnabled=false, sslInvalidHostNamesAllowed=false, alwaysUseMBeans=false, heartbeatFrequency=, minHeartbeatFrequency=, heartbeatConnectTimeout=, heartbeatSocketTimeout=, localThreshold=, requiredReplicaSetName='null', dbDecoderFactory=com.mongodb.DefaultDBDecoder$@1999df34, dbEncoderFactory=com.mongodb.DefaultDBEncoder$@4f04d552, socketFactory=null, cursorFinalizerEnabled=true, connectionPoolSettings=ConnectionPoolSettings{maxSize=, minSize=, maxWaitQueueSize=, maxWaitTimeMS=, maxConnectionLifeTimeMS=, maxConnectionIdleTimeMS=, maintenanceInitialDelayMS=, maintenanceFrequencyMS=}, socketSettings=SocketSettings{connectTimeoutMS=, readTimeoutMS=, keepAlive=false, receiveBufferSize=, sendBufferSize=}, serverSettings=ServerSettings{heartbeatFrequencyMS=, minHeartbeatFrequencyMS=, serverListeners='[]', serverMonitorListeners='[]'}, heartbeatSocketSettings=SocketSettings{connectTimeoutMS=, readTimeoutMS=, keepAlive=false, receiveBufferSize=, sendBufferSize=}}}
// :: WARN MongoClientCache: Release without acquire for key: Mongo{options=MongoClientOptions{description='null', applicationName='null', readPreference=primary, writeConcern=WriteConcern{w=null, wTimeout=null ms, fsync=null, journal=null, readConcern=com.mongodb.ReadConcern@, codecRegistry=org.bson.codecs.configuration.ProvidersCodecRegistry@8cef5306, commandListeners=[], clusterListeners=[], serverListeners=[], serverMonitorListeners=[], minConnectionsPerHost=, maxConnectionsPerHost=, threadsAllowedToBlockForConnectionMultiplier=, serverSelectionTimeout=, maxWaitTime=, maxConnectionIdleTime=, maxConnectionLifeTime=, connectTimeout=, socketTimeout=, socketKeepAlive=false, sslEnabled=false, sslInvalidHostNamesAllowed=false, alwaysUseMBeans=false, heartbeatFrequency=, minHeartbeatFrequency=, heartbeatConnectTimeout=, heartbeatSocketTimeout=, localThreshold=, requiredReplicaSetName='null', dbDecoderFactory=com.mongodb.DefaultDBDecoder$@1999df34, dbEncoderFactory=com.mongodb.DefaultDBEncoder$@4f04d552, socketFactory=null, cursorFinalizerEnabled=true, connectionPoolSettings=ConnectionPoolSettings{maxSize=, minSize=, maxWaitQueueSize=, maxWaitTimeMS=, maxConnectionLifeTimeMS=, maxConnectionIdleTimeMS=, maintenanceInitialDelayMS=, maintenanceFrequencyMS=}, socketSettings=SocketSettings{connectTimeoutMS=, readTimeoutMS=, keepAlive=false, receiveBufferSize=, sendBufferSize=}, serverSettings=ServerSettings{heartbeatFrequencyMS=, minHeartbeatFrequencyMS=, serverListeners='[]', serverMonitorListeners='[]'}, heartbeatSocketSettings=SocketSettings{connectTimeoutMS=, readTimeoutMS=, keepAlive=false, receiveBufferSize=, sendBufferSize=}}}
// :: INFO MongoClientCache: Creating MongoClient: [localhost:]
// :: INFO MongoClientCache: Closing MongoClient: [localhost:]
// :: INFO MongoClientCacheSpec: Ended Test: 'MongoClientCacheSpec'
[info] MongoClientCacheSpec:
[info] MongoClientCache
[info] - should create a client and then close the client once released
[info] - should create a client and then close the client once released and after the timeout
[info] - should return a different client once released
[info] - should not throw an exception when trying to release unacquired client
[info] - should eventually close all released clients on shutdown
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,)
// :: INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,)
// :: INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,)
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,)
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,)
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,)
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,)
[info] Test run started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldBeEasilyCreatedViaMongoSpark started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldIncludeAnyPipelinesWhenInferringTheSchema started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldBeEasilyCreatedViaMongoSparkAndSQLContext started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldThrowAnExceptionIfPipelineIsInvalid started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldBeEasilyCreatedWithMongoSparkAndJavaBean started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldBeEasilyCreatedWithAProvidedRDDAndJavaBean started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldBeEasilyCreatedFromTheSQLContext started
[info] Test run finished: failed, ignored, total, .098s
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN TestPartitioner: Could not find collection (MongoRDDSpec), using single partition
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: counter, filters:
// :: ERROR Executor: Exception in task 0.0 in stage 24.0 (TID )
com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast STRING into a IntegerType (value: BsonString{value='a'})
at com.mongodb.spark.sql.MapFunctions$.com$mongodb$spark$sql$MapFunctions$$convertToDataType(MapFunctions.scala:)
at com.mongodb.spark.sql.MapFunctions$$anonfun$.apply(MapFunctions.scala:)
at com.mongodb.spark.sql.MapFunctions$$anonfun$.apply(MapFunctions.scala:)
at scala.collection.TraversableLike$$anonfun$map$.apply(TraversableLike.scala:)
at scala.collection.TraversableLike$$anonfun$map$.apply(TraversableLike.scala:)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:)
at com.mongodb.spark.sql.MapFunctions$.documentToRow(MapFunctions.scala:)
at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$.apply(MongoRelation.scala:)
at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$.apply(MongoRelation.scala:)
at scala.collection.Iterator$$anon$.next(Iterator.scala:)
at scala.collection.Iterator$$anon$.next(Iterator.scala:)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$$$anon$.hasNext(WholeStageCodegenExec.scala:)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$.apply(SparkPlan.scala:)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$.apply(SparkPlan.scala:)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$$$anonfun$apply$.apply(RDD.scala:)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$$$anonfun$apply$.apply(RDD.scala:)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:)
at org.apache.spark.scheduler.Task.run(Task.scala:)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:)
at java.lang.Thread.run(Thread.java:)
// :: WARN TaskSetManager: Lost task 0.0 in stage 24.0 (TID , localhost, executor driver): com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast STRING into a IntegerType (value: BsonString{value='a'})
at com.mongodb.spark.sql.MapFunctions$.com$mongodb$spark$sql$MapFunctions$$convertToDataType(MapFunctions.scala:)
at com.mongodb.spark.sql.MapFunctions$$anonfun$.apply(MapFunctions.scala:)
at com.mongodb.spark.sql.MapFunctions$$anonfun$.apply(MapFunctions.scala:)
at scala.collection.TraversableLike$$anonfun$map$.apply(TraversableLike.scala:)
at scala.collection.TraversableLike$$anonfun$map$.apply(TraversableLike.scala:)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:)
at com.mongodb.spark.sql.MapFunctions$.documentToRow(MapFunctions.scala:)
at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$.apply(MongoRelation.scala:)
at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$.apply(MongoRelation.scala:)
at scala.collection.Iterator$$anon$.next(Iterator.scala:)
at scala.collection.Iterator$$anon$.next(Iterator.scala:)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$$$anon$.hasNext(WholeStageCodegenExec.scala:)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$.apply(SparkPlan.scala:)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$.apply(SparkPlan.scala:)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$$$anonfun$apply$.apply(RDD.scala:)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$$$anonfun$apply$.apply(RDD.scala:)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:)
at org.apache.spark.scheduler.Task.run(Task.scala:)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:)
at java.lang.Thread.run(Thread.java:)
// :: ERROR TaskSetManager: Task in stage 24.0 failed times; aborting job
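This MongoTypeConversionException is the expected outcome of the "should throw when creating a Dataset with invalid data" case further down: documentToRow is asked to fit BsonString{value='a'} into a column declared as IntegerType. The same trace appears in real jobs whenever the declared schema is narrower than the stored documents. A sketch of the failing and the working declaration, where the Counter case classes and the collection contents are assumptions for illustration rather than the suite's actual fixtures:

    import com.mongodb.spark.MongoSpark
    import org.apache.spark.sql.SparkSession

    // An Int field forces an IntegerType column named "counter".
    case class Counter(counter: Int)
    // Declaring it as a String (or cleaning the documents) avoids the failing cast.
    case class CounterAsString(counter: String)

    val spark = SparkSession.builder().getOrCreate()

    // Fails at action time with "Cannot cast STRING into a IntegerType"
    // if any document stores counter as a string such as "a":
    MongoSpark.load[Counter](spark).count()

    // Succeeds whether counter is stored as a string or as a number:
    MongoSpark.load[CounterAsString](spark).count()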
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: counter, filters:
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRDDSpec: Ended Test: 'MongoRDDSpec'
[info] MongoRDDSpec:
[info] MongoRDD
[info] - should be easily created from the SparkContext
[info] - should be able to handle non existent collections
[info] - should be able to query via a pipeline
[info] - should be able to handle different collection types
[info] - should be able to create a DataFrame by inferring the schema
[info] - should be able to create a DataFrame when provided a case class
[info] - should be able to create a DataFrame with a set schema
[info] - should be able to create a Dataset when provided a case class
[info] - should not allow Nothing when trying to create a Dataset
[info] - should throw when creating a Dataset with invalid data
[info] - should use default values when creating a Dataset with missing data
[info] - should be easy to use a custom partitioner
[info] - should be easy to use a custom partitioner that is an object
// :: INFO MongoRelationHelperSpec: Ended Test: 'MongoRelationHelperSpec'
[info] MongoRelationHelperSpec:
[info] createPipeline
[info] - should create an empty pipeline if no projection or filters
[info] - should project the required fields
[info] - should explicitly exclude _id from the projection if not required
[info] - should handle spark Filters
[info] - should and multiple spark Filters
// :: INFO MongoClientCache: Creating MongoClient: [localhost:]
// :: INFO MongoClientCache: Creating MongoClient: [localhost:]
[info] Test run started
[info] Test com.mongodb.spark.MongoConnectorTest.shouldCreateMongoConnectorWithCustomMongoClientFactory started
[info] Test com.mongodb.spark.MongoConnectorTest.shouldCreateMongoConnector started
[info] Test com.mongodb.spark.MongoConnectorTest.shouldUseTheMongoClientCache started
[info] Test com.mongodb.spark.MongoConnectorTest.shouldCreateMongoConnectorFromJavaSparkContext started
[info] Test run finished: failed, ignored, total, .043s
// :: INFO MongoShardedPartitionerSpec: Ended Test: 'MongoShardedPartitionerSpec'
[info] MongoShardedPartitionerSpec:
[info] MongoShardedPartitioner
[info] - should partition the database as expected !!! CANCELED !!!
[info] Not a Sharded MongoDB (MongoShardedPartitionerSpec.scala:)
[info] - should have a default bounds of min to max key !!! CANCELED !!!
[info] Not a Sharded MongoDB (MongoShardedPartitionerSpec.scala:)
[info] - should handle no collection !!! CANCELED !!!
[info] Not a Sharded MongoDB (MongoShardedPartitionerSpec.scala:)
[info] - should handle an empty collection !!! CANCELED !!!
[info] Not a Sharded MongoDB (MongoShardedPartitionerSpec.scala:)
[info] - should calculate the expected hosts for a single node shard
[info] - should calculate the expected hosts for a multi node shard
[info] - should return distinct hosts
[info] - should calculate the expected Partitions
// :: INFO PartitionerHelperSpec: Ended Test: 'PartitionerHelperSpec'
[info] PartitionerHelperSpec:
[info] PartitionerHelper
[info] - should create the expected partitions query
[info] - should create the correct partitions
// :: INFO DefaultMongoClientFactorySpec: Ended Test: 'DefaultMongoClientFactorySpec'
[info] DefaultMongoClientFactorySpec:
[info] DefaultMongoClientFactory
[info] - should create a MongoClient from the connection string
[info] - should implement equals based on the prefix less options map
[info] - should set the localThreshold correctly
[info] - should validate the connection string
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN TestPartitioner: Could not find collection (com.mongodb.spark.MongoSparkTest), using single partition
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: counter, filters:
// :: INFO MongoClientCache: Closing MongoClient: [localhost:]
// :: INFO MongoClientCache: Closing MongoClient: [localhost:]
// :: ERROR Executor: Exception in task 0.0 in stage 1.0 (TID )
com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast STRING into a IntegerType (value: BsonString{value='a'})
at com.mongodb.spark.sql.MapFunctions$.com$mongodb$spark$sql$MapFunctions$$convertToDataType(MapFunctions.scala:)
at com.mongodb.spark.sql.MapFunctions$$anonfun$.apply(MapFunctions.scala:)
at com.mongodb.spark.sql.MapFunctions$$anonfun$.apply(MapFunctions.scala:)
at scala.collection.TraversableLike$$anonfun$map$.apply(TraversableLike.scala:)
at scala.collection.TraversableLike$$anonfun$map$.apply(TraversableLike.scala:)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:)
at com.mongodb.spark.sql.MapFunctions$.documentToRow(MapFunctions.scala:)
at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$.apply(MongoRelation.scala:)
at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$.apply(MongoRelation.scala:)
at scala.collection.Iterator$$anon$.next(Iterator.scala:)
at scala.collection.Iterator$$anon$.next(Iterator.scala:)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$$$anon$.hasNext(WholeStageCodegenExec.scala:)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$.apply(SparkPlan.scala:)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$.apply(SparkPlan.scala:)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$$$anonfun$apply$.apply(RDD.scala:)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$$$anonfun$apply$.apply(RDD.scala:)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:)
at org.apache.spark.scheduler.Task.run(Task.scala:)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:)
at java.lang.Thread.run(Thread.java:)
// :: WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID , localhost, executor driver): com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast STRING into a IntegerType (value: BsonString{value='a'})
at com.mongodb.spark.sql.MapFunctions$.com$mongodb$spark$sql$MapFunctions$$convertToDataType(MapFunctions.scala:)
at com.mongodb.spark.sql.MapFunctions$$anonfun$.apply(MapFunctions.scala:)
at com.mongodb.spark.sql.MapFunctions$$anonfun$.apply(MapFunctions.scala:)
at scala.collection.TraversableLike$$anonfun$map$.apply(TraversableLike.scala:)
at scala.collection.TraversableLike$$anonfun$map$.apply(TraversableLike.scala:)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:)
at com.mongodb.spark.sql.MapFunctions$.documentToRow(MapFunctions.scala:)
at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$.apply(MongoRelation.scala:)
at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$.apply(MongoRelation.scala:)
at scala.collection.Iterator$$anon$.next(Iterator.scala:)
at scala.collection.Iterator$$anon$.next(Iterator.scala:)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$$$anon$.hasNext(WholeStageCodegenExec.scala:)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$.apply(SparkPlan.scala:)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$.apply(SparkPlan.scala:)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$$$anonfun$apply$.apply(RDD.scala:)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$$$anonfun$apply$.apply(RDD.scala:)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:)
at org.apache.spark.scheduler.Task.run(Task.scala:)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:)
at java.lang.Thread.run(Thread.java:)
// :: ERROR TaskSetManager: Task in stage 1.0 failed times; aborting job
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: counter, filters:
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: counter, filters:
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: counter, filters:
[info] Test run started
[info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToCreateADataFrameUsingJavaBean started
[info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToHandleNoneExistentCollections started
[info] Test com.mongodb.spark.MongoSparkTest.shouldThrowWhenCreatingADatasetWithInvalidData started
[info] Test com.mongodb.spark.MongoSparkTest.useDefaultValuesWhenCreatingADatasetWithMissingData started
[info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToCreateADataFrameByInferringTheSchemaUsingSparkSession started
[info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToCreateADataFrameByInferringTheSchema started
[info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToQueryViaAPipeLine started
[info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToCreateADatasetUsingJavaBeanWithSparkSession started
[info] Test com.mongodb.spark.MongoSparkTest.useACustomPartitioner started
[info] Test com.mongodb.spark.MongoSparkTest.shouldBeCreatableFromTheSparkContextWithAlternativeReadAndWriteConfigs started
[info] Test com.mongodb.spark.MongoSparkTest.shouldBeCreatableFromTheSparkContext started
[info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToHandleDifferentCollectionTypes started
[info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToCreateADatasetUsingJavaBean started
[info] Test run finished: failed, ignored, total, .207s
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: _id, age, name, filters:
// :: INFO NoSparkConfSpec: Ended Test: 'be able to accept just options'
[info] NoSparkConfSpec:
[info] MongoRDD
[info] - should be able to accept just Read / Write Configs
[info] DataFrame Readers and Writers
[info] - should be able to accept just options
[info] Test run started
[info] Test com.mongodb.spark.config.WriteConfigTest.shouldBeCreatableFromAJavaMap started
[info] Test com.mongodb.spark.config.WriteConfigTest.shouldBeCreatableFromTheSparkConf started
[info] Test com.mongodb.spark.config.WriteConfigTest.shouldBeCreatableFromAJavaMapAndUseDefaults started
[info] Test run finished: failed, ignored, total, .002s
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN TaskSetManager: Stage contains a task of very large size ( KB). The maximum recommended task size is KB.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoClientCache: Closing MongoClient: [localhost:]
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec'
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN MongoInferSchema: Array Field 'a' contains conflicting types converting to StringType
// :: WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
// :: WARN MongoInferSchema: Array Field 'a' contains conflicting types converting to StringType
// :: INFO MongoInferSchemaSpec: Ended Test: 'MongoInferSchemaSpec'
[info] MongoInferSchemaSpec:
[info] MongoSchemaHelper
[info] - should be able to infer the schema from simple types
[info] - should be able to infer the schema from a flat array
[info] - should be able to infer the schema from a flat document
[info] - should be able to infer the schema from a nested array
[info] - should be able to infer the schema from a multi level document
[info] - should be able to infer the schema with custom sampleSize
[info] - should ignore empty arrays and null values in arrays
[info] - should use any set pipelines on the RDD
[info] - should upscale number types based on numeric precedence
[info] - should be able to infer the schema from arrays with mixed keys
[info] - should be able to infer the schema from arrays with mixed numerics
[info] - should be able to infer the schema from nested arrays with mixed keys
[info] - should still mark incompatible schemas with a StringType
[info] Test run started
[info] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromAJavaMap started
[info] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromTheSparkConf started
[info] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromAJavaMapAndUseDefaults started
[info] Test run finished: failed, ignored, total, .003s
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: INFO MongoRelation: requiredColumns: binary, filters: IsNotNull(binary)
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: INFO MongoRelation: requiredColumns: oldBinary, filters: IsNotNull(oldBinary)
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: INFO MongoRelation: requiredColumns: dbPointer, filters: IsNotNull(dbPointer)
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: INFO MongoRelation: requiredColumns: code, filters: IsNotNull(code)
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: INFO MongoRelation: requiredColumns: codeWithScope, filters: IsNotNull(codeWithScope)
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: INFO MongoRelation: requiredColumns: maxKey, filters: IsNotNull(maxKey)
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: INFO MongoRelation: requiredColumns: minKey, filters: IsNotNull(minKey)
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: INFO MongoRelation: requiredColumns: objectId, filters: IsNotNull(objectId)
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: INFO MongoRelation: requiredColumns: regex, filters: IsNotNull(regex)
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: INFO MongoRelation: requiredColumns: regexWithOptions, filters: IsNotNull(regexWithOptions)
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: INFO MongoRelation: requiredColumns: symbol, filters: IsNotNull(symbol)
// :: INFO HelpersSpec: Running Test: 'HelpersSpec'
// :: INFO MongoRelation: requiredColumns: timestamp, filters: IsNotNull(timestamp)
// :: INFO HelpersSpec: Ended Test: 'HelpersSpec'
[info] HelpersSpec:
[info] the user defined function helpers
[info] - should handle Binary values
[info] - should handle Binary values with a subtype
[info] - should handle DbPointers
[info] - should handle JavaScript
[info] - should handle JavaScript with scope
[info] - should handle maxKeys
[info] - should handle minKeys
[info] - should handle ObjectIds
[info] - should handle Regular Expressions
[info] - should handle Regular Expressions with options
[info] - should handle Symbols
[info] - should handle Timestamps
// :: WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
// :: WARN TestPartitioner: Could not find collection (beabletoconnecttoanauthenticateddb), using single partition
// :: INFO AuthConnectionSpec: Ended Test: 'AuthConnectionSpec'
[info] AuthConnectionSpec:
[info] MongoRDD
[info] - should be able to connect to an authenticated db
[info] Reading scoverage instrumentation [/usr/local/mongo-spark/target/scala-2.11/scoverage-data/scoverage.coverage.xml]
[info] Reading scoverage measurements...
[info] Generating scoverage reports...
[info] Written Cobertura report [/usr/local/mongo-spark/target/scala-2.11/coverage-report/cobertura.xml]
[info] Written XML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/scoverage.xml]
[info] Written HTML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/index.html]
[info] Coverage reports completed
[info] All done. Coverage was [80.60%]
[info] ScalaCheck
[info] Passed: Total , Failed , Errors , Passed
[info] ScalaTest
[info] Run completed in seconds, milliseconds.
[info] Total number of tests run:
[info] Suites: completed , aborted
[info] Tests: succeeded , failed , canceled , ignored , pending
[info] All tests passed.
[info] Passed: Total , Failed , Errors , Passed , Canceled
[success] Total time: s, completed -- ::
[info] Aggregating coverage from subprojects...
[info] Found subproject report files [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/scoverage.xml]
[info] No subproject data to aggregate, skipping reports
[success] Total time: s, completed -- ::
[info] Waiting for measurement data to sync...
[info] Reading scoverage instrumentation [/usr/local/mongo-spark/target/scala-2.11/scoverage-data/scoverage.coverage.xml]
[info] Reading scoverage measurements...
[info] Generating scoverage reports...
[info] Written Cobertura report [/usr/local/mongo-spark/target/scala-2.11/coverage-report/cobertura.xml]
[info] Written XML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/scoverage.xml]
[info] Written HTML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/index.html]
// :: INFO MongoClientCache: Closing MongoClient: [localhost:]
[info] Coverage reports completed
[success] Total time: s, completed -- ::
// :: INFO MongoClientCache: Closing MongoClient: [localhost:]
[root@hadoop1 mongo-spark]#

mongodb/mongo-spark: The MongoDB Spark Connector https://github.com/mongodb/mongo-spark

After ./sbt check reported errors,

the console showed that the MongoDB user was not authorized.

Check which mongod processes are running:

ps aux | grep mongo

Kill the mongod service that requires authentication (started as ./mongod --port 27017 --auth) and restart it without access control: ./mongod --port 27017.

Then re-run ./sbt check.

It passes.
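
For reference, here is a minimal shell sketch of the workaround above. The --dbpath/--logpath values, the pkill pattern, and the commented alternative of creating a privileged test user are assumptions, not part of the original notes; adjust them to match how mongod was originally started on your machine.

# A minimal sketch of the workaround (assumptions: mongod was started by hand,
# its data directory is /data/db and it logs to /var/log/mongod.log).

# 1. Find the running mongod and confirm it was started with --auth.
ps aux | grep [m]ongod

# 2. Stop the authenticated instance (matches the --auth command line).
pkill -f 'mongod.*--auth'

# 3. Restart it without access control on the default port.
mongod --port 27017 --dbpath /data/db --logpath /var/log/mongod.log --fork

# 4. Re-run the connector's test suite from the repository root.
./sbt check

# Alternative that keeps --auth enabled: create a user allowed to drop the
# test database and point the test suite at it through its MongoDB
# connection-string setting (the exact property name depends on the project's
# test configuration). User name and password below are hypothetical:
# mongo admin --eval 'db.createUser({user: "sparkTest", pwd: "changeme", roles: ["root"]})'

Disabling --auth is only appropriate for a local test instance; the commented alternative keeps authentication on and instead grants a dedicated user the rights the test suite needs to drop its test database.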
