sparkR could not find function "textFile"
Yeah, that’s probably because the head() you’re invoking there is defined for SparkR DataFrames
[1] (note how you don’t have to use the SparkR::: namespace in front of it), but SparkR:::textFile()
returns an RDD object, which is more like a distributed list data structure the way you’re
applying it over that .md text file. If you want to look at the first item or the first several
items in the RDD, I think you want SparkR:::first() or SparkR:::take(), both of which
operate on RDDs.
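For example, a minimal sketch, assuming sc is the SparkContext created when the sparkR shell starts and README.md is in your working directory:

lines <- SparkR:::textFile(sc, "./README.md")  # RDD of lines, via the private API
SparkR:::first(lines)                          # the first line of the file
SparkR:::take(lines, 5)                        # a list of the first five lines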
Just remember that all the functions described in the public API [2] for SparkR right now
relate mostly to working with DataFrames. For the private functions you might want, you’ll
have to use the R command-line doc or look at the RDD source code [3] (which includes the
doc strings used to generate the R doc), whichever you find easier.
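If it helps, you can see everything that lives in the package namespace (exported or not) and pull up the generated help pages straight from the R prompt; a rough sketch, assuming the R doc for the private functions was built along with the package:

ls(getNamespace("SparkR"))            # lists every function in the namespace, private ones included
help("textFile", package = "SparkR")  # the help page built from the doc strings, if it exists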
Alek
[1] -- http://spark.apache.org/docs/latest/api/R/head.html
[2] -- https://spark.apache.org/docs/latest/api/R/index.html
[3] -- https://github.com/apache/spark/blob/master/R/pkg/R/RDD.R
From: Wei Zhou <zhweisophie@gmail.com>
Date: Thursday, June 25, 2015 at 3:49 PM
To: Aleksander Eskilson <Alek.Eskilson@cerner.com>
Cc: "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: sparkR could not find function "textFile"
Hi Alek,
Just a follow-up question. This is what I did in the sparkR shell:
lines <- SparkR:::textFile(sc, "./README.md")
head(lines)
And I am getting the error:
"Error in x[seq_len(n)] : object of type 'S4' is not subsettable"
I'm wondering what I did wrong. Thanks in advance.
Wei
2015-06-25 13:44 GMT-07:00 Wei Zhou <zhweisophie@gmail.com>:
Hi Alek,
Thanks for the explanation, it is very helpful.
Cheers,
Wei
2015-06-25 13:40 GMT-07:00 Eskilson, Aleksander <Alek.Eskilson@cerner.com>:
Hi there,
The tutorial you’re reading there was written before the merge of SparkR for Spark 1.4.0.
For the merge, the RDD API (which includes the textFile() function) was made private, as the
devs felt many of its functions were too low level. They focused instead on finishing the
DataFrame API, which supports local, HDFS, and Hive/HBase file reads. In the meantime, the
devs are trying to determine which functions of the RDD API, if any, should be made public
again. You can see the rationale behind this decision on the issue’s JIRA [1].
You can still make use of those now-private RDD functions by prefixing the function call
with the SparkR private namespace; for example, you’d use
SparkR:::textFile(…).
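As a rough sketch of what that looks like end-to-end, the old AMPLab word-count example can still be pieced together from the private functions; the exact signatures below are from the pre-merge package, so treat them as assumptions and check the RDD source if they have drifted:

lines <- SparkR:::textFile(sc, "./README.md")
words <- SparkR:::flatMap(lines, function(line) { strsplit(line, " ")[[1]] })
pairs <- SparkR:::lapply(words, function(word) { list(word, 1L) })  # (word, 1) pairs
counts <- SparkR:::reduceByKey(pairs, "+", 2L)                      # sum counts across 2 partitions
SparkR:::take(counts, 10)                                           # first ten (word, count) pairs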
Hope that helps,
Alek
[1] -- https://issues.apache.org/jira/browse/SPARK-7230
From: Wei Zhou <zhweisophie@gmail.com>
Date: Thursday, June 25, 2015 at 3:33 PM
To: "user@spark.apache.org" <user@spark.apache.org>
Subject: sparkR could not find function "textFile"
Hi all,
I am exploring sparkR by starting the sparkR shell and following the tutorial here: https://amplab-extras.github.io/SparkR-pkg/
When I tried to read in a local file with textFile(sc, "file_location"), it gave the error:
could not find function "textFile".
By reading through the sparkR doc for 1.4, it seems that we need a sqlContext to import data, for
example:
people <- read.df(sqlContext, "./examples/src/main/resources/people.json", "json")
And we need to specify the file type.
My question is: has sparkR stopped supporting importing general file types? If not, I would
appreciate any help on how to do this.
PS: I am trying to recreate the word count example in sparkR, and want to import the README.md
file, or really any file, into sparkR.
Thanks in advance.
Best,
Wei