Hi, I've used a Phoenix table to store billions of rows. Rows are incrementally inserted into Phoenix by Spark every day, and the table serves instant queries from a web page by primary key. So far so good.
Thanks,
Dalin

On Mon, Sep 12, 2016 at 10:07 AM, Cheyenne Forbes <[email protected]> wrote:

> Thanks everyone, I will be using Phoenix for simple input/output and the
> phoenix-spark plugin (https://phoenix.apache.org/phoenix_spark.html) for
> more complex queries. Is that the smart thing?
>
> Regards,
>
> Cheyenne Forbes
>
> Chief Executive Officer
> Avapno Omnitech
>
> Chief Operating Officer
> Avapno Solutions, Co.
>
> Chairman
> Avapno Assets, LLC
>
> Bethel Town P.O.
> Westmoreland
> Jamaica
>
> Email: [email protected]
> Mobile: 876-881-7889
> Skype: cheyenne.forbes1
>
> On Sun, Sep 11, 2016 at 11:07 AM, Ted Yu <[email protected]> wrote:
>
>> W.r.t. resource management, Spark also relies on other frameworks such as
>> YARN or Mesos.
>>
>> Cheers
>>
>> On Sun, Sep 11, 2016 at 6:31 AM, John Leach <[email protected]> wrote:
>>
>>> Spark has a robust execution model with the following features that are
>>> not part of Phoenix:
>>>   * Scalability
>>>   * Fault tolerance with lineage (handles large intermediate results)
>>>   * Memory management for tasks
>>>   * Resource management (fair scheduling)
>>>   * Additional SQL features (windowing, etc.)
>>>   * Machine learning libraries
>>>
>>> Regards,
>>> John
>>>
>>> > On Sep 11, 2016, at 2:45 AM, Cheyenne Forbes <[email protected]> wrote:
>>> >
>>> > I realized there is a Spark plugin for Phoenix, any use cases? Why
>>> > would I use Spark with Phoenix instead of Phoenix by itself?
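[For reference, the split described above (Phoenix for simple input/output, phoenix-spark for heavier queries) can be sketched as below with the Spark-1.x-era connector API. This is an illustration only: the table name, columns, and ZooKeeper quorum are hypothetical, not from the thread, and the exact API depends on your Phoenix/Spark versions.]

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
// Brings phoenixTableAsDataFrame and related implicits into scope.
import org.apache.phoenix.spark._

val sc = new SparkContext("local", "phoenix-spark-example")
val sqlContext = new SQLContext(sc)

// Load a Phoenix table as a DataFrame via the connector
// (hypothetical table/columns; "zkhost:2181" is a placeholder quorum).
val df = sqlContext.phoenixTableAsDataFrame(
  "MY_TABLE",
  Seq("ID", "EVENT_TYPE"),
  zkUrl = Some("zkhost:2181"))

// A heavier analytical query is pushed through Spark's execution engine,
// while simple key lookups would still go straight to Phoenix over JDBC.
df.groupBy("EVENT_TYPE").count().show()
```

The point-lookup path (web page querying by primary key, as in the first message) would stay on plain Phoenix JDBC; the connector is for scans and aggregations that benefit from Spark's distributed execution.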
