Hello,
As I have mentioned to others, but will repeat in case you have not seen it, you will probably have much better luck with Bacula 9.6.4. However, you will need to remove the libs3-dev and libs3-2 packages (or whatever they are called) from your Ubuntu system, fetch the new libs3 source code (documented in the ReleaseNotes for Bacula 9.6.4), and build it from source.
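Roughly, the swap on Ubuntu looks like this (a sketch only; package names and build steps may differ slightly, and the libs3 source location is the one listed in the 9.6.4 ReleaseNotes):

apt-get remove libs3-dev libs3-2   # drop the distribution libs3 packages
# download and unpack the libs3 source referenced in the 9.6.4 ReleaseNotes, then:
cd libs3
make                               # check the libs3 README for the exact build targets
make install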
I have successfully tested the building and running of the 9.6.3
code on Ubuntu 18.04 and Ubuntu 20.04.
Best regards,
Kern
On 6/1/20 5:48 PM, esca wrote:
Hello,
I'm trying to set up Bacula with the S3 driver (using S3 from Scaleway) on a brand new Ubuntu 18.04.4 LTS VM, but the SD daemon always crashes.
I have installed the packages directly from bacula.org
# apt-cache policy bacula-cloud-storage
bacula-cloud-storage:
Installed: 9.6.3-1
Candidate: 9.6.3-1
Version table:
*** 9.6.3-1 500
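(The cloud driver plugin from those packages lives under /opt/bacula/plugins, as the trace below also shows; a quick way to check the install, assuming the default package paths:)

dpkg -l | grep bacula     # installed Bacula packages
ls /opt/bacula/plugins/   # should list bacula-sd-cloud-driver-9.6.3.so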
my director conf is:
Storage {
Name = SCW_S3_PAR
Address = dev-vm05.mydomain # N.B. Use a fully qualified name here
SDPort = 9103
Password = "some strings"
Device = SCW_S3_PAR
Media Type = CloudType
Maximum Concurrent Jobs = 6
}
and my bacula-sd.conf
Device {
Name = SCW_S3_PAR
Device Type = Cloud
Cloud = SCW_S3_PAR
Archive Device = /opt/bacula/backups
Maximum Part Size = 10 GB
Media Type = CloudType
LabelMedia = yes
Random Access = yes
AutomaticMount = yes
RemovableMedia = no
AlwaysOpen = no
}
Cloud {
Name = SCW_S3_PAR
Driver = "S3"
HostName = "s3.fr-par.scw.cloud"
BucketName = "bacula"
AccessKey = "myaccesskey"
SecretKey = "mysecretkey"
Protocol = HTTPS
UriStyle = Path
Truncate Cache = AfterUpload
Upload = EachPart
Region = "fr-par"
MaximumUploadBandwidth = 500MB/s
}
All services start just fine, but when I start a job the Storage Daemon crashes:
dev-vm05:/opt/bacula# bin/bacula-sd -d100 -dt -v -f
01-Jun-2020 17:26:11 bacula-sd: address_conf.c:274-0 Initaddr 0.0.0.0:9103
01-Jun-2020 17:26:11 dev-vm05: jcr.c:131-0 read_last_jobs seek to 192
01-Jun-2020 17:26:11 dev-vm05: jcr.c:138-0 Read num_items=1
01-Jun-2020 17:26:11 dev-vm05: plugins.c:97-0 load_plugins
01-Jun-2020 17:26:11 dev-vm05: plugins.c:133-0 Rejected plugin: want=-sd.so name=bacula-sd-cloud-driver-9.6.3.so len=31
01-Jun-2020 17:26:11 dev-vm05: plugins.c:133-0 Rejected plugin: want=-sd.so name=bpipe-fd.so len=11
01-Jun-2020 17:26:11 dev-vm05: plugins.c:121-0 Failed to find any plugins in /opt/bacula/plugins
01-Jun-2020 17:26:11 dev-vm05: stored.c:613-0 calling init_dev SCW_S3_PAR
01-Jun-2020 17:26:11 dev-vm05: init_dev.c:152-0 Num drivers=15
01-Jun-2020 17:26:11 dev-vm05: init_dev.c:165-0 loadable=1 type=14 loaded=0 name=cloud handle=0
01-Jun-2020 17:26:11 dev-vm05: init_dev.c:432-0 loadable=1 type=14 loaded=0 name=cloud handle=0
01-Jun-2020 17:26:11 dev-vm05: init_dev.c:437-0 Open SD driver at /opt/bacula/plugins/bacula-sd-cloud-driver-9.6.3.so
01-Jun-2020 17:26:11 dev-vm05: init_dev.c:440-0 Driver=cloud handle=7f04440014b0
01-Jun-2020 17:26:11 dev-vm05: init_dev.c:442-0 Lookup "BaculaSDdriver" in driver=cloud
01-Jun-2020 17:26:11 dev-vm05: init_dev.c:444-0 Driver=cloud entry point=7f044a4e4f30
01-Jun-2020 17:26:11 dev-vm05: init_dev.c:212-0 init_dev allocated: 7f0444001af8
01-Jun-2020 17:26:11 dev-vm05: init_dev.c:393-0 init_dev: tape=0 dev_name=/opt/bacula/backups
01-Jun-2020 17:26:11 dev-vm05: dev.c:1101-0 DEVICE::register_metrics called. 0x7f0444001af8 collector=0x560259842328
01-Jun-2020 17:26:11 dev-vm05: stored.c:615-0 SD init done SCW_S3_PAR (0x7f0444001af8)
01-Jun-2020 17:26:11 dev-vm05: acquire.c:671-0 Attach 0x44003b48 to dev "SCW_S3_PAR" (/opt/bacula/backups)
01-Jun-2020 17:26:11 dev-vm05: bnet_server.c:86-0 Addresses 0.0.0.0:9103
01-Jun-2020 17:26:21 dev-vm05: bsock.c:851-0 socket=8 who=client host=127.0.0.1 port=47660
01-Jun-2020 17:26:21 dev-vm05: dircmd.c:190-0 Got a DIR connection at 01-Jun-2020 17:26:21
01-Jun-2020 17:26:21 dev-vm05: cram-md5.c:69-0 send: auth cram-md5 challenge <637208849.1591025181@dev-vm05> ssl=0
01-Jun-2020 17:26:21 dev-vm05: cram-md5.c:133-0 cram-get received: auth cram-md5 <849613792.1591025181@dev-vm05-dir> ssl=0
01-Jun-2020 17:26:21 dev-vm05: cram-md5.c:157-0 sending resp to challenge: Kz/xCmg1X++/Im+Lj4Nj0A
01-Jun-2020 17:26:21 dev-vm05: dircmd.c:216-0 Message channel init completed.
01-Jun-2020 17:26:21 dev-vm05: dircmd.c:1169-0 Found device SCW_S3_PAR
01-Jun-2020 17:26:21 dev-vm05: dircmd.c:1213-0 Found device SCW_S3_PAR
01-Jun-2020 17:26:21 dev-vm05: acquire.c:671-0 Attach 0x44003d98 to dev "SCW_S3_PAR" (/opt/bacula/backups)
Bacula interrupted by signal 11: Segmentation violation
Kaboom! bacula-sd, dev-vm05 got signal 11 - Segmentation violation at 01-Jun-2020 17:26:21. Attempting traceback.
Kaboom! exepath=/opt/bacula/bin/
Calling: /opt/bacula/bin/btraceback /opt/bacula/bin/bacula-sd 25133 /opt/bacula/working
bsmtp: bsmtp.c:488-0 Failed to connect to mailhost localhost
The btraceback call returned 1
LockDump: /opt/bacula/working/bacula.25133.traceback
01-Jun-2020 17:26:22 dev-vm05: lockmgr.c:1221-0 lockmgr disabled
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 536 bytes at 560259848de8 from jcr.c:386
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 280 bytes at 560259844368 from jcr.c:390
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 280 bytes at 560259844618 from jcr.c:384
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 154 bytes at 560259842138 from lib/mem_pool.h:84
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 536 bytes at 560259845198 from bsockcore.c:157
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 280 bytes at 7f04440011c8 from jcr.c:388
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 280 bytes at 7f0444001318 from dircmd.c:790
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 448 bytes at 560259852658 from bsock.c:852
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 4120 bytes at 560259849818 from bsockcore.c:156
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 4120 bytes at 56025984a868 from bsock.c:101
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 7 bytes at 560259846ec8 from bsock.c:854
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 10 bytes at 560259851a18 from bsock.c:855
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 16 bytes at 5602598463b8 from workq.c:198
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 24 bytes at 7f0444000f28 from jcr.c:372
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 32 bytes at 7f0444000f78 from dircmd.c:194
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 24 bytes at 7f0444000fd8 from askdir.c:575
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 536 bytes at 7f0444003b48 from record_util.c:251
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 1280 bytes at 7f0444003d98 from acquire.c:640
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 32 bytes at 7f0444001118 from acquire.c:643
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 32 bytes at 7f04440042d8 from acquire.c:644
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 168 bytes at 7f0444004338 from block_util.c:146
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 64536 bytes at 7f0444004418 from block_util.c:163
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 64536 bytes at 7f0444014068 from block_util.c:164
01-Jun-2020 17:26:22 dev-vm05: smartall.c:411-0 Orphaned buffer: dev-vm05 184 bytes at 7f0444023cb8 from record_util.c:249
The output of status storage:
*status storage
Automatically selected Storage: SCW_S3_PAR
Connecting to Storage daemon SCW_S3_PAR at dev-vm05.mydomain:9103
dev-vm05.mydomain: 9.6.3 (09 March 2020) x86_64-pc-linux-gnu ubuntu 18.04
Daemon started 01-Jun-20 17:43. Jobs: run=0, running=0.
Heap: heap=110,592 smbytes=33,498 max_bytes=154,292 bufs=143 max_bufs=144
Sizes: boffset_t=8 size_t=8 int32_t=4 int64_t=8 mode=0,0 newbsr=0
Res: ndevices=1 nautochgr=1
Drivers: cloud
Running Jobs:
No Jobs running.
====
Jobs waiting to reserve a drive:
====
Terminated Jobs:
====
Device status:
Autochanger "FileChgr2" with devices:
"SCW_S3_PAR" (/opt/bacula/backups)
Device Cloud: "SCW_S3_PAR" (/opt/bacula/backups) is not open.
Drive 0 is not loaded.
Configured device capabilities:
EOF BSR BSF FSR FSF EOM !REM RACCESS AUTOMOUNT LABEL !ANONVOLS !ALWAYSOPEN !SYNCONCLOSE
Device state:
!OPENED !TAPE !LABEL !APPEND !READ !EOT !WEOT !EOF !WORM !NEXTVOL !SHORT !MOUNTED !MALLOC
Writers=0 reserves=0 blocked=0 enabled=1 usage=0
Attached JobIds:
Device parameters:
Archive name: /opt/bacula/backups Device name: SCW_S3_PAR
File=0 block=0
Min block=0 Max block=0
Available Cache Space=26.32 GB
==
====
Cloud transfer status:
Uploads (0 B/s) (ETA 0 s) Queued=0 0 B, Processed=0 0 B, Done=0 0 B, Failed=0 0 B
Downloads (0 B/s) (ETA 0 s) Queued=0 0 B, Processed=0 0 B, Done=0 0 B, Failed=0 0 B
====
Used Volume status:
====
and the dump:
[New LWP 25135]
[New LWP 25163]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
0x00007f044cd5f03f in __GI___select (nfds=nfds@entry=8, readfds=readfds@entry=0x7ffe53e35a20, writefds=writefds@entry=0x0, exceptfds=exceptfds@entry=0x0, timeout=timeout@entry=0x0) at ../sysdeps/unix/sysv/linux/select.c:41
41 ../sysdeps/unix/sysv/linux/select.c: No such file or directory.
$1 = "01-Jun-2020 17:26:21\000\000\000\000\000\000\000\000\000"
$2 = 0x5602597e2ee0 <my_name> "dev-vm05"
$3 = 0x5602598430e8 "bacula-sd"
$4 = 0x560259843128 "/opt/bacula/bin/bacula-sd"
$5 = 0x0
$6 = '\000' <repeats 49 times>
$7 = 0x7f044d64255b "9.6.3 (09 March 2020)"
$8 = 0x7f044d64253a "x86_64-pc-linux-gnu"
$9 = 0x7f044d642533 "ubuntu"
$10 = 0x7f044d642555 "18.04"
$11 = "dev-vm05", '\000' <repeats 30 times>
$12 = 0x7f044d64254e "ubuntu 18.04"
Environment variable "TestName" not defined.
#0 0x00007f044cd5f03f in __GI___select (nfds=nfds@entry=8, readfds=readfds@entry=0x7ffe53e35a20, writefds=writefds@entry=0x0, exceptfds=exceptfds@entry=0x0, timeout=timeout@entry=0x0) at ../sysdeps/unix/sysv/linux/select.c:41
#1 0x00007f044d5fe618 in bnet_thread_server (addrs=<optimized out>, max_clients=41, client_wq=0x5602597e3020 <dird_workq>, handle_client_request=0x5602595cbee0 <handle_connection_request(void*)>) at bnet_server.c:166
#2 0x00005602595c326a in main (argc=<optimized out>, argv=<optimized out>) at stored.c:326
Thread 3 (Thread 0x7f044b6f4700 (LWP 25163)):
#0 0x00007f044d3d423a in __waitpid (pid=pid@entry=25164, stat_loc=stat_loc@entry=0x7f044b6f332c, options=options@entry=0) at ../sysdeps/unix/sysv/linux/waitpid.c:30
#1 0x00007f044d62953e in signal_handler (sig=11) at signal.c:233
#2 <signal handler called>
#3 0x00007f044a4e9787 in cloud_dev::get_cloud_volumes_list (this=<optimized out>, dcr=0x7f0444003d98, volumes=0x7f044b6f3c50, err=@0x7f044b6f3c48: 0x7f0444001330 "") at cloud_dev.h:110
#4 0x00005602595c7829 in cloud_list_cmd (jcr=<optimized out>) at dircmd.c:815
#5 0x00005602595cc394 in handle_connection_request (arg=0x560259852658) at dircmd.c:242
#6 0x00007f044d634518 in workq_server (arg=0x5602597e3020 <dird_workq>) at workq.c:372
#7 0x00007f044d3c96db in start_thread (arg=0x7f044b6f4700) at pthread_create.c:463
#8 0x00007f044cd6988f in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:95
Thread 2 (Thread 0x7f044aef3700 (LWP 25135)):
#0 0x00007f044d3cff85 in futex_abstimed_wait_cancelable (private=<optimized out>, abstime=0x7f044aef2e90, expected=0, futex_word=0x7f044d858008 <_ZL5timer+40>) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
#1 __pthread_cond_wait_common (abstime=0x7f044aef2e90, mutex=0x7f044d858020 <_ZL11timer_mutex>, cond=0x7f044d857fe0 <_ZL5timer>) at pthread_cond_wait.c:539
#2 __pthread_cond_timedwait (cond=cond@entry=0x7f044d857fe0 <_ZL5timer>, mutex=mutex@entry=0x7f044d858020 <_ZL11timer_mutex>, abstime=abstime@entry=0x7f044aef2e90) at pthread_cond_wait.c:667
#3 0x00007f044d633b56 in watchdog_thread (arg=<optimized out>) at watchdog.c:299
#4 0x00007f044d3c96db in start_thread (arg=0x7f044aef3700) at pthread_create.c:463
#5 0x00007f044cd6988f in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:95
Thread 1 (Thread 0x7f044deeb300 (LWP 25133)):
#0 0x00007f044cd5f03f in __GI___select (nfds=nfds@entry=8, readfds=readfds@entry=0x7ffe53e35a20, writefds=writefds@entry=0x0, exceptfds=exceptfds@entry=0x0, timeout=timeout@entry=0x0) at ../sysdeps/unix/sysv/linux/select.c:41
#1 0x00007f044d5fe618 in bnet_thread_server (addrs=<optimized out>, max_clients=41, client_wq=0x5602597e3020 <dird_workq>, handle_client_request=0x5602595cbee0 <handle_connection_request(void*)>) at bnet_server.c:166
#2 0x00005602595c326a in main (argc=<optimized out>, argv=<optimized out>) at stored.c:326
#0 0x00007f044cd5f03f in __GI___select (nfds=nfds@entry=8, readfds=readfds@entry=0x7ffe53e35a20, writefds=writefds@entry=0x0, exceptfds=exceptfds@entry=0x0, timeout=timeout@entry=0x0) at ../sysdeps/unix/sysv/linux/select.c:41
41 in ../sysdeps/unix/sysv/linux/select.c
resultvar = 18446744073709551102
sc_cancel_oldtype = 0
sc_ret = <optimized out>
#1 0x00007f044d5fe618 in bnet_thread_server (addrs=<optimized out>, max_clients=41, client_wq=0x5602597e3020 <dird_workq>, handle_client_request=0x5602595cbee0 <handle_connection_request(void*)>) at bnet_server.c:166
166 bnet_server.c: No such file or directory.
maxfd = 7
sockset = {fds_bits = {128, 0 <repeats 15 times>}}
newsockfd = <optimized out>
stat = <optimized out>
clilen = 16
clientaddr = {ss_family = 2, __ss_padding = "\272,\177\000\000\001\000\000\000\000\000\000\000\000(#\204Y\002V", '\000' <repeats 18 times>, "\036\000\316M\004\177\000\000\177\003", '\000' <repeats 22 times>, "\200\037\000\000\377\377", '\000' <repeats 41 times>, __ss_align = 0}
tlog = <optimized out>
turnon = 1
request = {fd = 8, user = '\000' <repeats 127 times>, daemon = "dev-vm05", '\000' <repeats 108 times>, pid = "25133\000\000\000\000", client = {{name = '\000' <repeats 127 times>, addr = '\000' <repeats 127 times>, sin = 0x7f044cc477e0, unit = 0x0, request = 0x7ffe53e35aa0}}, server = {{name = '\000' <repeats 127 times>, addr = '\000' <repeats 127 times>, sin = 0x7f044cc47760, unit = 0x0, request = 0x7ffe53e35aa0}}, sink = 0x0, hostname = 0x7f044ca43b30 <sock_hostname>, hostaddr = 0x7f044ca43ae0 <sock_hostaddr>, cleanup = 0x0, config = 0x0}
addr = <optimized out>
fd_ptr = 0x0
buf = "127.0.0.1", '\000' <repeats 118 times>
sockfds = {<SMARTALLOC> = {<No data fields>}, head = 0x7ffe53e358d0, tail = 0x7ffe53e358d0, loffset = 0, num_items = 1}
allbuf = "0.0.0.0:9103 ", '\000' <repeats 11 times>, "М^M\004\177\000\000\033p\351\003\000\000\000\000$\271\205M\004\177\000\000\300\225\205M\004\177\000\000X\031^M\004\177\000\000\020\232\357M\004\177\000\000\000\000\000\000\004\177\000\000``\343S\376\177\000\000\003\000\000\000\004\177\000\000P`\343S\376\177\000\000\000\000\000\000\376\177\000\000\330\314\356M\004\177\000\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\001\000\000\000\004\177\000\000\330\314\356M\004\177\000\000\301\006\\\372\000\000\000\000h\230\357M\004\177\000\000\370`\343S\376\177\000\000\000\000\000\000\000\000\000\000\020\225\357M\004\177\000\000\000\000\000\000\000\000\000\000\357"...
#2 0x00005602595c326a in main (argc=<optimized out>, argv=<optimized out>) at stored.c:326
326 stored.c: No such file or directory.
ch = <optimized out>
no_signals = <optimized out>
thid = 139656422180608
uid = 0x0
gid = 0x0
#0 0x0000000000000000 in ?? ()
No symbol table info available.
#0 0x0000000000000000 in ?? ()
No symbol table info available.
#0 0x0000000000000000 in ?? ()
No symbol table info available.
#0 0x0000000000000000 in ?? ()
No symbol table info available.
#0 0x0000000000000000 in ?? ()
No symbol table info available.
Attempt to dump current JCRs. njcrs=1
threadid=0x7f044b6f4700 JobId=0 JobStatus=C jcr=0x7f04440008f8 name=*System*
use_count=1 killable=1
JobType=I JobLevel=
sched_time=01-Jun-2020 17:26 start_time=01-Jan-1970 01:00
end_time=01-Jan-1970 01:00 wait_time=01-Jan-1970 01:00
db=(nil) db_batch=(nil) batch_started=0
dcr=*None*
List plugins. Hook count=0
Do you have any idea of what's wrong in my setup?
Thanks!
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users