My name is Octavio Glorio, and I am a Data Architect at Flow Computing, a knowledge-intensive startup. We are currently working on a data analysis report for the fishery industry.

We started on the old cluster, but we are running into storage problems: we have 1.5 GB of plain-text documents, from which we produce a Hive table. We are now trying to move to the new Cosmos Global Instance; however, it seems that we need to contact the Cosmos Big Data Team to get space allocated on HDFS (on either the computing or the storage cluster). Is that correct? That is our understanding of the information we have read. Thanks in advance.

Regards,
Octavio.
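P.S. In case it helps to clarify our workflow, this is roughly how we build the Hive table over the documents once they are on HDFS. The table name, path, and one-column schema below are simplified placeholders, not our real ones:

    -- Minimal sketch: an external Hive table over plain-text files on HDFS.
    -- Table name, location, and schema are illustrative placeholders.
    CREATE EXTERNAL TABLE fishery_docs (
        doc_line STRING
    )
    STORED AS TEXTFILE
    LOCATION '/user/octavio/fishery_docs/';

A query such as SELECT COUNT(*) FROM fishery_docs; then runs over the files in place on HDFS, which is why we need the storage space allocated before we can proceed.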