09-16-2016, 09:33 AM
So I've been working with my Pine for a few months now, trying to get a Hadoop server running to support my work as an integration architect. My plan is to use the Pine as a Linux server, running multiple Hadoop clusters, a Postgres DB, and proprietary integration software from my employer. As an experiment I simply installed everything on the micro SD card, but that isn't sustainable size-wise, so I experimented with using a USB thumb drive as the main disk. That only worked somewhat, since I had to muck around with disk mounts etc., and ultimately, as soon as I had it all working, a bad shutdown corrupted the thumb drive, sending me back to square one.
I have since invested in a 1 TB USB HDD and a powered USB hub, and now need to figure out the optimal way to set it all up. My first question is: is it best to reformat the external HDD to a Linux partition, or leave it as NTFS? I know NTFS has limitations with setting permissions, which is an issue with Postgres. How would the Linux gurus approach this?
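For what it's worth, if ext4 turns out to be the answer, this is roughly the sequence I had in mind. The device name /dev/sda, the label pinedata, and the mount point /mnt/pinedata are all placeholders for illustration; I'd verify the actual device with `lsblk` first:

```shell
# WARNING: destructive -- this wipes the drive. Confirm with `lsblk`
# that /dev/sda really is the external HDD before running anything.
sudo parted /dev/sda --script mklabel gpt mkpart primary ext4 0% 100%

# Format the new partition as ext4, which supports full Unix
# ownership/permissions (the thing Postgres needs and NTFS lacks)
sudo mkfs.ext4 -L pinedata /dev/sda1

# Create a mount point and an fstab entry so it remounts after reboot
sudo mkdir -p /mnt/pinedata
echo 'LABEL=pinedata /mnt/pinedata ext4 defaults,noatime 0 2' | sudo tee -a /etc/fstab
sudo mount -a

# Give Postgres its own directory on the new disk
sudo mkdir -p /mnt/pinedata/postgres
sudo chown postgres:postgres /mnt/pinedata/postgres
```

Mounting by label (or by UUID) rather than by device name also avoids surprises if the hub enumerates the drives in a different order on the next boot.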