Comments (5)
I don't like it, but I used the following function (from http://r.789695.n4.nabble.com/write-csv-to-text-string-td4704560.html) to write the data frame to a temporary file and read it back as a string, which I then uploaded using the AzureSMR API.
write.csv.str <- function(df, row.names = FALSE) {
  # Write the data frame to a temporary file, then slurp it back as one string
  filepath <- tempfile()
  f <- file(filepath, "w")
  write.csv(df, f, row.names = row.names)
  close(f)
  readr::read_file(filepath)
}
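The upload itself is then just azurePutBlob with that string. Roughly (a sketch: sc is an AzureSMR context created with createAzureContext, and the account, container, and blob names are placeholders):

# Sketch of the upload step; sc, account, container, and blob names are placeholders
azurePutBlob(sc,
             storageAccount = "mystorageaccount",
             container = "mycontainer",
             contents = write.csv.str(df),
             blob = "df.csv")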
@rsmith54 This is the method I am using; thanks for providing it.
I would suggest you use fwrite from the data.table package instead of write.csv. For text columns, write.csv won't delimit them correctly, so reading the blob back will error.
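For example, the helper above could be rewritten with fwrite (a sketch; the later chunked-upload comment in this thread embeds the same variant):

library(data.table)

# Sketch: same helper as above, but writing with data.table::fwrite
write.csv.str <- function(df, row.names = FALSE) {
  filepath <- tempfile()
  fwrite(df, filepath, row.names = row.names)  # quotes fields only when necessary
  readr::read_file(filepath)
}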
Have you been able to achieve this successfully? If so, can you share your code? It's failing for me; I posted my issue here as well.
Maybe you can try this (writing the iris data frame to a blob):

# Write the CSV into the character vector `irisString` via a text connection
write.csv(iris, textConnection("irisString", "w"), row.names = FALSE)
azurePutBlob(sc,
             storageAccount = "testmystorage1",
             container = "opendata",
             contents = paste(irisString, collapse = "\n"),
             blob = "HELLO")
For large data frames, I created a pair of functions: save_chunks uploads the data in chunks sized to stay under the 64 MB put-blob limit, and get_blob_data reads the chunks back and reassembles them into a single data frame. Usage is sketched after the code.
library(AzureSMR)
library(data.table)  # fwrite, fread
library(readr)       # read_file, write_lines

save_chunks <- function(df, tenant_id, client_id, auth_key, auth_type,
                        resource_group, storage_account, container_name,
                        blob_folder, blob_name) {
  # Serialise a data frame to a CSV string via a temporary file
  write.csv.str <- function(df, row.names = FALSE) {
    filepath <- tempfile()
    fwrite(df, filepath, row.names = row.names)
    readr::read_file(filepath)
  }
  # Get the size of one row and figure out how many rows per chunk,
  # limiting each chunk to 64 MB
  num_rows <- round(64000000 / (as.numeric(object.size(df)) / nrow(df)), 0)
  id <- 1
  start <- 1
  end <- min(start + num_rows, nrow(df))
  # clientID is the application id, authKey the password
  sc <- createAzureContext(tenantID = tenant_id,
                           clientID = client_id,
                           authKey = auth_key,
                           authType = auth_type)
  azureSAGetKey(sc, resourceGroup = resource_group, storageAccount = storage_account)
  while (end <= nrow(df)) {
    # Upload one chunk; id is appended so each chunk gets its own blob
    chunk <- df[start:end, ]
    azurePutBlob(sc,
                 storageAccount = storage_account,
                 container = container_name,
                 contents = write.csv.str(chunk),
                 blob = paste(blob_folder, '/', blob_name, '_', id, sep = ''))
    if (end == nrow(df))
      return(NULL)
    id <- id + 1
    start <- end + 1
    end <- min(start + num_rows, nrow(df))
    print(start)
    print(end)
  }
}
get_blob_data <- function(tenant_id, client_id, auth_key, auth_type,
                          resource_group, storage_account, container_name,
                          blob_folder, blob_name) {
  # clientID is the application id, authKey the password
  sc <- createAzureContext(tenantID = tenant_id,
                           clientID = client_id,
                           authKey = auth_key,
                           authType = auth_type)
  azureSAGetKey(sc, resourceGroup = resource_group, storageAccount = storage_account)
  # List the chunk blobs written by save_chunks
  all_files <- azureBlobLS(sc, directory = blob_folder, recursive = FALSE,
                           storageAccount = storage_account,
                           resourceGroup = resource_group,
                           container = container_name)
  total_files <- nrow(all_files)
  print(total_files)
  blob_df <- NULL
  start <- 1
  while (start <= total_files) {
    print(start)
    # Download one chunk, round-trip it through a temp file, and append it
    data <- azureGetBlob(azureActiveContext = sc,
                         storageAccount = storage_account,
                         directory = blob_folder,
                         container = container_name,
                         resourceGroup = resource_group,
                         blob = paste(blob_name, '_', start, sep = ''))
    filepath <- tempfile()
    readr::write_lines(data, filepath)
    if (is.null(blob_df)) {
      blob_df <- fread(filepath, showProgress = FALSE)
    } else {
      blob_df <- rbind(blob_df, fread(filepath, showProgress = FALSE))
    }
    start <- start + 1
  }
  return(blob_df)
}
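Usage is roughly as follows (a sketch; the credentials and resource names below are placeholders to substitute with your own):

# Placeholders -- substitute your own tenant/app credentials and storage names
save_chunks(big_df,
            tenant_id = "<tenant-id>", client_id = "<app-id>",
            auth_key = "<auth-key>", auth_type = "ClientCredential",
            resource_group = "my-rg", storage_account = "mystorageaccount",
            container_name = "mycontainer",
            blob_folder = "chunks", blob_name = "big_df")

restored <- get_blob_data(tenant_id = "<tenant-id>", client_id = "<app-id>",
                          auth_key = "<auth-key>", auth_type = "ClientCredential",
                          resource_group = "my-rg", storage_account = "mystorageaccount",
                          container_name = "mycontainer",
                          blob_folder = "chunks", blob_name = "big_df")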