From: "Artem V. Andreev" <artem.andr...@oktetlabs.ru> Callback to calculate required memory area size may require mempool driver data to be already allocated and initialized.
Signed-off-by: Artem V. Andreev <artem.andr...@oktetlabs.ru>
Signed-off-by: Andrew Rybchenko <arybche...@solarflare.com>
Acked-by: Santosh Shukla <santosh.shu...@caviumnetworks.com>
Acked-by: Olivier Matz <olivier.m...@6wind.com>
---
v3 -> v4:
 - rebase

v2 -> v3:
 - none

v1 -> v2:
 - add init check to mempool_ops_alloc_once()
 - move earlier in the patch series since it is required when driver ops
   are called and it is better to have it before new ops are added

RFCv2 -> v1:
 - rename helper function as mempool_ops_alloc_once()

 lib/librte_mempool/rte_mempool.c | 33 ++++++++++++++++++++++++++-------
 1 file changed, 26 insertions(+), 7 deletions(-)

diff --git a/lib/librte_mempool/rte_mempool.c b/lib/librte_mempool/rte_mempool.c
index d9c09e1..b15b79b 100644
--- a/lib/librte_mempool/rte_mempool.c
+++ b/lib/librte_mempool/rte_mempool.c
@@ -346,6 +346,21 @@ rte_mempool_free_memchunks(struct rte_mempool *mp)
 	}
 }
 
+static int
+mempool_ops_alloc_once(struct rte_mempool *mp)
+{
+	int ret;
+
+	/* create the internal ring if not already done */
+	if ((mp->flags & MEMPOOL_F_POOL_CREATED) == 0) {
+		ret = rte_mempool_ops_alloc(mp);
+		if (ret != 0)
+			return ret;
+		mp->flags |= MEMPOOL_F_POOL_CREATED;
+	}
+	return 0;
+}
+
 /* Add objects in the pool, using a physically contiguous memory
  * zone. Return the number of objects added, or a negative value
  * on error.
@@ -362,13 +377,9 @@ rte_mempool_populate_iova(struct rte_mempool *mp, char *vaddr,
 	struct rte_mempool_memhdr *memhdr;
 	int ret;
 
-	/* create the internal ring if not already done */
-	if ((mp->flags & MEMPOOL_F_POOL_CREATED) == 0) {
-		ret = rte_mempool_ops_alloc(mp);
-		if (ret != 0)
-			return ret;
-		mp->flags |= MEMPOOL_F_POOL_CREATED;
-	}
+	ret = mempool_ops_alloc_once(mp);
+	if (ret != 0)
+		return ret;
 
 	/* Notify memory area to mempool */
 	ret = rte_mempool_ops_register_memory_area(mp, vaddr, iova, len);
@@ -570,6 +581,10 @@ rte_mempool_populate_default(struct rte_mempool *mp)
 	int ret;
 	bool force_contig, no_contig, try_contig, no_pageshift;
 
+	ret = mempool_ops_alloc_once(mp);
+	if (ret != 0)
+		return ret;
+
 	/* mempool must not be populated */
 	if (mp->nb_mem_chunks != 0)
 		return -EEXIST;
@@ -774,6 +789,10 @@ rte_mempool_populate_anon(struct rte_mempool *mp)
 		return 0;
 	}
 
+	ret = mempool_ops_alloc_once(mp);
+	if (ret != 0)
+		return ret;
+
 	/* get chunk of virtually continuous memory */
 	size = get_anon_size(mp);
 	addr = mmap(NULL, size, PROT_READ | PROT_WRITE,
-- 
2.7.4
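
For readers following the series, here is a minimal, hypothetical sketch of why the
ordering enforced by mempool_ops_alloc_once() matters: a driver's ->alloc op stores a
per-pool context in mp->pool_data, and a later size-calculation op dereferences that
context, so the allocation must run first. The driver, the example_priv structure and
the simplified callback signature are invented for illustration only; struct
rte_mempool, mp->pool_data, mp->elt_size, rte_mempool_ops_alloc() and rte_zmalloc()
are the real DPDK symbols. The real size-calculation op added later in this series
takes additional parameters.

#include <errno.h>
#include <stdint.h>
#include <sys/types.h>

#include <rte_common.h>
#include <rte_malloc.h>
#include <rte_mempool.h>

/* Hypothetical per-pool context created by the driver's ->alloc op. */
struct example_priv {
	size_t obj_align;	/* assumed hardware alignment requirement */
};

/* Driver ->alloc callback: invoked via rte_mempool_ops_alloc(), i.e. from
 * mempool_ops_alloc_once() in the populate functions above.
 */
static int
example_alloc(struct rte_mempool *mp)
{
	struct example_priv *priv;

	priv = rte_zmalloc("example_priv", sizeof(*priv), 0);
	if (priv == NULL)
		return -ENOMEM;
	priv->obj_align = 128;	/* assumed constraint, for illustration */
	mp->pool_data = priv;
	return 0;
}

/* Simplified stand-in for a size-calculation op. It dereferences
 * mp->pool_data, so the ->alloc op above must already have run; that is
 * why the populate paths now call mempool_ops_alloc_once() first.
 */
static ssize_t
example_calc_mem_size(const struct rte_mempool *mp, uint32_t obj_num)
{
	const struct example_priv *priv = mp->pool_data;

	return (ssize_t)obj_num * RTE_ALIGN_CEIL(mp->elt_size, priv->obj_align);
}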