thrift - In Apache Spark SQL, How to close metastore connection from HiveContext -


My project has unit tests that run against different HiveContext configurations (sometimes grouped by feature in a single file).

After upgrading to Spark 1.4 I encounter a lot of 'java.sql.SQLException: Another instance of Derby may have already booted the database' problems, since a patch made those contexts unable to share the same metastore. Because it is not clean to revert the state of a singleton for every test, my only option boils down to "recycling" each context by terminating the previous Derby metastore connection. Is there a way to do this?

Well, in Scala I used FunSuite for unit tests together with the BeforeAndAfterAll trait. You can initialize the SparkContext in beforeAll, spawn a HiveContext from it, and finish like this:

    override def afterAll(): Unit = {
      if (sparkContext != null) {
        sparkContext.stop()
      }
    }

From what I've noticed, this also closes the HiveContext attached to it.
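For context, the whole pattern the answer describes could be sketched roughly as below. This is a minimal sketch, not the asker's actual suite: the class name `MyHiveSuite`, the `local[2]` master, and the sample query are assumptions, and it targets the Spark 1.x API (`HiveContext`) with ScalaTest's `FunSuite` and `BeforeAndAfterAll`:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Hypothetical suite: one SparkContext/HiveContext per suite, torn down in
// afterAll so the next suite can open a fresh Derby metastore connection.
class MyHiveSuite extends FunSuite with BeforeAndAfterAll {

  @transient private var sparkContext: SparkContext = _
  @transient private var hiveContext: HiveContext = _

  override def beforeAll(): Unit = {
    super.beforeAll()
    val conf = new SparkConf()
      .setMaster("local[2]")      // assumed local test master
      .setAppName("hive-suite")
    sparkContext = new SparkContext(conf)
    hiveContext = new HiveContext(sparkContext)
  }

  override def afterAll(): Unit = {
    try {
      if (sparkContext != null) {
        // Stopping the SparkContext also tears down the HiveContext
        // attached to it, releasing the Derby metastore connection.
        sparkContext.stop()
      }
    } finally {
      super.afterAll()
    }
  }

  test("hive context answers a trivial query") {
    assert(hiveContext.sql("SELECT 1").collect().nonEmpty)
  }
}
```

Keeping the contexts as suite-level vars (rather than a shared singleton) is what lets each suite start and stop its own metastore connection cleanly.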
