[VL] Cannot write Hive table on HDFS #5879
Comments
@RaoZhiRou-Z, did you run the test with the Gluten main branch?
Yes, the main branch.
@RaoZhiRou-Z, it seems to be a bug. I created a PR to fix it: #5881. Please take a look.
Thanks, I will give it a try.
@PHILO-HE I have tried the bug fix, but there is another crash. The core stack shows:
#0 __GI_raise (sig=sig@entry=6) at /root/work/deck/devel/toolchain/glibc-2.33/signal/raise.c:49
@RaoZhiRou-Z, that's strange. Could you share some details about your test so it can be reproduced?
Description
When the config `spark.gluten.sql.native.writer.enabled true` is added, an error occurs: "The file path is not local when writing data with parquet format in velox runtime!"

Does Velox not support writing Hive tables on HDFS yet? The error is thrown from this check in `VeloxParquetDatasource::init`:
```cpp
void VeloxParquetDatasource::init(const std::unordered_map<std::string, std::string>& sparkConfs) {
  if (strncmp(filePath_.c_str(), "file:", 5) == 0) {
    sink_ = dwio::common::FileSink::create(filePath_, {.pool = pool_.get()});
  } else {
    throw std::runtime_error(
        "The file path is not local when writing data with parquet format in velox runtime!");
  }
  .........
}
```
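For context, here is a minimal sketch of how such a check could branch on the URI scheme instead of rejecting every non-local path. This is illustrative only, not the actual change from PR #5881; `HdfsFileSink` is a hypothetical name, since whether Velox exposes a dedicated HDFS sink (or dispatches by scheme inside `FileSink::create`) depends on the Velox build and its HDFS storage adapter.

```cpp
// Illustrative sketch only -- not the actual fix from PR #5881.
// Pick a sink based on the URI scheme rather than failing on anything non-local.
void VeloxParquetDatasource::init(const std::unordered_map<std::string, std::string>& sparkConfs) {
  if (strncmp(filePath_.c_str(), "file:", 5) == 0) {
    // Local filesystem path: already supported.
    sink_ = dwio::common::FileSink::create(filePath_, {.pool = pool_.get()});
  } else if (strncmp(filePath_.c_str(), "hdfs:", 5) == 0) {
    // HDFS path: would need a sink backed by an HDFS storage adapter.
    // "HdfsFileSink" is a hypothetical name used for illustration.
    sink_ = HdfsFileSink::create(filePath_, {.pool = pool_.get()});
  } else {
    throw std::runtime_error("Unsupported scheme for velox parquet write: " + filePath_);
  }
}
```

The key design point is that the sink choice is derived from the path scheme, so HDFS support ultimately hinges on a matching sink implementation being available and wired in.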