Here are the changes for the hanging issues:

https://asterix-gerrit.ics.uci.edu/#/c/491/
https://asterix-gerrit.ics.uci.edu/#/c/492/
Both changes' Jenkins builds pass. Can anyone review them so that further Jenkins builds can be unblocked? Thanks!

Best,
Yingyi

On Thu, Nov 12, 2015 at 8:01 PM, Yingyi Bu <[email protected]> wrote:
> OK, I'll reproduce that.
> Thanks!
>
> Best,
> Yingyi
>
> On Thu, Nov 12, 2015 at 7:23 PM, Murtadha Hubail <[email protected]> wrote:
>> Hi Yingyi,
>>
>> I think this merge (https://asterix-gerrit.ics.uci.edu/#/c/487/) caused
>> the Asterix recovery test cases to get stuck when a duplicate key
>> exception happens.
>>
>> Could you please have a look at it? You can reproduce it with the
>> statements below.
>>
>> @Others,
>> I'm also getting the exception below on the current master every time I
>> start AsterixHyracksIntegrationUtil or during tests. Is anyone
>> experiencing the same?
>>
>> java.net.ConnectException: Connection refused
>>     at java.net.PlainSocketImpl.socketConnect(Native Method)
>>     at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
>>     at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
>>     at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
>>     at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
>>     at java.net.Socket.connect(Socket.java:589)
>>     at java.net.Socket.connect(Socket.java:538)
>>     at java.net.Socket.<init>(Socket.java:434)
>>     at java.net.Socket.<init>(Socket.java:211)
>>     at org.apache.asterix.common.feeds.FeedMessageService$FeedMessageHandler.run(FeedMessageService.java:101)
>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>     at java.lang.Thread.run(Thread.java:745)
>>
>> -Murtadha
>>
>> drop dataverse recovery if exists;
>> create dataverse recovery;
>> use dataverse recovery;
>>
>> /* For raw Fragile data */
>> create type FragileTypeRaw as closed {
>>   row_id: int32,
>>   sid: int32,
>>   date: string,
>>   day: int32,
>>   time: string,
>>   bpm: int32,
>>   RR: float
>> };
>>
>> /* For cleaned Fragile data */
>> create type FragileType as closed {
>>   row_id: int32,
>>   sid: int32,
>>   date: date,
>>   day: int32,
>>   time: time,
>>   bpm: int32,
>>   RR: float
>> };
>>
>> /* Create dataset for loading raw Fragile data */
>> create dataset Fragile_raw (FragileTypeRaw)
>>   primary key row_id;
>>
>> /* Create dataset for cleaned Fragile data */
>> create dataset Fragile (FragileType)
>>   primary key row_id;
>>
>> use dataverse recovery;
>>
>> load dataset Fragile_raw using
>>   "org.apache.asterix.external.dataset.adapter.NCFileSystemAdapter"
>>   (("path"="127.0.0.1://data/csv/fragile_01.csv"),("format"="delimited-text"),("delimiter"=","))
>>   pre-sorted;
>>
>> use dataverse recovery;
>>
>> /* Load Fragile data from raw dataset into cleaned dataset */
>> insert into dataset Fragile (
>>   for $t in dataset Fragile_raw
>>   return {
>>     "row_id": $t.row_id % 28000,
>>     "sid": $t.sid,
>>     "date": date($t.date),
>>     "day": $t.day,
>>     "time": parse-time($t.time, "h:m:s"),
>>     "bpm": $t.bpm,
>>     "RR": $t.RR
>>   }
>> );
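For anyone trying the repro above without access to the original fragile_01.csv: the load statement expects delimited-text rows matching FragileTypeRaw's field order (row_id, sid, date, day, time, bpm, RR). A stand-in file with made-up values (purely hypothetical, not the real data) could be created like this:

```shell
# Create a tiny stand-in for data/csv/fragile_01.csv.
# Field order matches FragileTypeRaw: row_id, sid, date, day, time, bpm, RR.
# All values below are invented for illustration only.
mkdir -p data/csv
cat > data/csv/fragile_01.csv <<'EOF'
1,101,2015-11-01,1,8:30:12,72,0.85
2,101,2015-11-01,1,8:30:13,74,0.81
EOF

# Sanity check: every row must have exactly 7 comma-separated fields,
# or the delimited-text parser will reject it.
awk -F',' 'NF != 7 { exit 1 }' data/csv/fragile_01.csv && echo "format ok"
```

Note the date and time columns are loaded as strings and only converted by `date()` and `parse-time()` in the insert step, so they just need to be parseable by those functions.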
