[ https://issues.apache.org/jira/browse/RNG-176?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17536692#comment-17536692 ]

Alex Herbert commented on RNG-176:
----------------------------------

{quote}Please make a prominent note in the sources that those codes are 
redundant (and should be removed at the next opportunity).
{quote}
I started doing this, but the justification based on linkage errors seems 
weak. I think we can drop the redundant methods.
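
As a self-contained sketch of why the defaults make the core copies redundant 
(this uses a hypothetical two-method interface, not the real 
UniformRandomProvider or the real core classes):
{code:java}
// Hypothetical mini-version of the client-api/core split. The interface
// plays the role of UniformRandomProvider with a default method (as in
// client-api 1.5); the class plays the role of a core generator that only
// supplies the source of randomness and inherits the default.
interface Rng {
    long nextLong();                 // the single abstract method

    // Default added in a later API version; a core implementation no
    // longer needs its own copy, so the copy in core is redundant.
    default int nextInt() {
        return (int) (nextLong() >>> 32);
    }
}

// SplitMix64 (a public-domain algorithm) as the 64-bit source of randomness.
final class SplitMix64 implements Rng {
    private long state;

    SplitMix64(long seed) {
        state = seed;
    }

    @Override
    public long nextLong() {
        long z = (state += 0x9e3779b97f4a7c15L);
        z = (z ^ (z >>> 30)) * 0xbf58476d1ce4e5b9L;
        z = (z ^ (z >>> 27)) * 0x94d049bb133111ebL;
        return z ^ (z >>> 31);
    }
}
{code}
If an implementation keeps its own nextInt() override, removing that override 
later is binary compatible for callers: the call resolves through the 
interface default instead. The linkage risk only arises when the interface 
jar on the classpath is an old version without the default.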

A user should only need to include the client-api, simple and/or sampling 
modules. The sampling module depends only on the client-api and can function 
with client-api version 1.0. In that case no linkage error can occur if the 
user creates their RNG instance without using the simple module.

The simple module's version must match the core module's, otherwise the 
latest generators for that version of simple cannot be created. This applies 
when an incremental release has added generators, as is the case from 1.4 to 
1.5. The error depends on what the calling code is doing. If it uses an 
existing enum constant there should be no issue, as simple 1.5 should be able 
to create a functioning RNG for any generator in 1.4. If it lists all 
constants using RandomSource.values() then some instances will throw a 
NoClassDefFoundError when the core version does not contain the 
implementations.
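
The failure mode when enumerating can be sketched with a stand-in enum (the 
Source type, its constants and the missing class name here are all 
hypothetical, standing in for RandomSource and a core implementation class):
{code:java}
// Hypothetical stand-in for RandomSource.values() against an older core
// jar: creating a constant whose implementation class is absent throws
// NoClassDefFoundError, while older constants still work.
enum Source {
    ADDED_IN_1_4, ADDED_IN_1_5;

    Object create() {
        if (this == ADDED_IN_1_5) {
            // Simulates the implementation class missing from core 1.4.
            throw new NoClassDefFoundError("o.a.c.rng.core.NewGenerator");
        }
        return new Object();
    }

    // Count the constants that the core jar on the classpath can create.
    static int countCreatable() {
        int ok = 0;
        for (Source s : values()) {
            try {
                s.create();
                ok++;
            } catch (NoClassDefFoundError e) {
                // Skip generators the core jar does not provide.
            }
        }
        return ok;
    }
}
{code}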

The core module should be brought in transitively at the correct (matching) 
version. If it is not, the user should correct this by specifying matching 
versions for each module.
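
If the resolved versions do diverge, the user can pin all modules to the same 
release, e.g. via dependencyManagement (versions shown are illustrative):
{code:xml}
<!-- Pin all three modules to one release so the resolved client-api
     cannot lag behind simple/core (versions illustrative). -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-rng-client-api</artifactId>
      <version>1.5</version>
    </dependency>
    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-rng-simple</artifactId>
      <version>1.5</version>
    </dependency>
    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-rng-core</artifactId>
      <version>1.5</version>
    </dependency>
  </dependencies>
</dependencyManagement>
{code}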

The question is whether we support binary compatibility (i.e. the ability to 
upgrade with no code changes) when the jar versions of the simple module and 
the client-api are mismatched. This can occur when a user must provide a 
UniformRandomProvider to another library that specifies only client-api v1.0, 
so the user includes the latest version of simple in order to create one.

I tried this and Maven resolves the following dependency tree:
{noformat}
com.abc:lib:jar:1.0
\- org.apache.commons:commons-rng-client-api:jar:1.0:compile

com.abc:client:jar:1.0
+- org.apache.commons:commons-rng-simple:jar:1.5-SNAPSHOT:compile
|  \- org.apache.commons:commons-rng-core:jar:1.5-SNAPSHOT:compile
\- com.abc:lib:jar:1.0:compile
   \- org.apache.commons:commons-rng-client-api:jar:1.0:compile
{noformat}
Here Maven picks 1.0 for the client-api as it is the nearest definition in 
the dependency tree. But if methods are removed from core 1.5, core then 
relies on client-api 1.5 for some default implementations, so a linkage error 
will occur.

Currently the POM for simple includes only core. If I add the client-api to 
the POM for simple (since client-api and core must have matching versions) 
then this resolves as:
{noformat}
com.abc:client:jar:1.0
+- org.apache.commons:commons-rng-simple:jar:1.5-SNAPSHOT:compile
|  +- org.apache.commons:commons-rng-client-api:jar:1.5-SNAPSHOT:compile
|  \- org.apache.commons:commons-rng-core:jar:1.5-SNAPSHOT:compile
\- com.abc:lib:jar:1.0:compile
{noformat}
This has solved the problem.

So it seems that the multi-module packaging imposes these restrictions:
 # Require that the core and client-api module versions be in sync
 # Require that the simple, core and client-api module versions be in sync

The solution would be:
 # Add a dependency on the client-api module in the simple module to satisfy 
restriction 2
 # Remove the redundant methods from the core module that have defaults in the 
client-api. These jars must then be in sync, so no linkage errors can occur 
from missing methods
 # Fix the binary compatibility check to allow this

I have done item 1 in master as this seems sensible. For example, the simple 
module uses interfaces from the client-api, such as 
JumpableUniformRandomProvider, that appeared after v1.0 (in this case in 
1.3). It seems sensible to use an explicit dependency rather than a 
transitive one.
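
The change amounts to declaring the client-api directly in the simple 
module's POM rather than relying on the transitive path through core; a 
sketch of the added entry (versions managed elsewhere, e.g. by the parent 
POM):
{code:xml}
<!-- In commons-rng-simple/pom.xml: an explicit dependency on client-api
     alongside the existing core dependency, so both resolve at the
     matching release (sketch; version inherited from the parent). -->
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-rng-client-api</artifactId>
</dependency>
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-rng-core</artifactId>
</dependency>
{code}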

I have added revapi to master as an additional binary compatibility check 
tool; it allows the changes under discussion. It detects only one issue with 
the current build, in an internal package, which I have added as a noted API 
difference to ignore. When the CI builds have verified master is OK, I will 
try updating the PR to remove the redundant methods from the core module. 
This requires that japicmp be disabled for this module in the current release 
cycle (as it compares the current build with the previous release).

 

> Enhance the UniformRandomProvider interface with extra methods and default 
> implementations
> ------------------------------------------------------------------------------------------
>
>                 Key: RNG-176
>                 URL: https://issues.apache.org/jira/browse/RNG-176
>             Project: Commons RNG
>          Issue Type: New Feature
>    Affects Versions: 1.4
>            Reporter: Alex Herbert
>            Assignee: Alex Herbert
>            Priority: Major
>          Time Spent: 20m
>  Remaining Estimate: 0h
>
> JDK 17 introduced the {{RandomGenerator}} interface with the following 
> methods:
> {code:java}
> DoubleStream doubles();
> DoubleStream doubles(double randomNumberOrigin, double randomNumberBound);
> DoubleStream doubles(long streamSize);
> DoubleStream doubles(long streamSize, double randomNumberOrigin,
>                      double randomNumberBound);
> IntStream ints();
> IntStream ints(int randomNumberOrigin, int randomNumberBound);
> IntStream ints(long streamSize);
> IntStream ints(long streamSize, int randomNumberOrigin,
>                int randomNumberBound);
> LongStream longs();
> LongStream longs(long randomNumberOrigin, long randomNumberBound);
> LongStream longs(long streamSize);
> LongStream longs(long streamSize, long randomNumberOrigin,
>                  long randomNumberBound);
> boolean nextBoolean();
> void nextBytes(byte[] bytes);
> float nextFloat();
> float nextFloat(float bound);
> float nextFloat(float origin, float bound);
> double nextDouble();
> double nextDouble(double bound);
> double nextDouble(double origin, double bound);
> int nextInt();
> int nextInt(int bound);
> int nextInt(int origin, int bound);
> long nextLong();
> long nextLong(long bound);
> long nextLong(long origin, long bound);
> double nextGaussian();
> double nextGaussian(double mean, double stddev);
> double nextExponential();
> {code}
> The only method that is *non-default* is {{nextLong}}. This allows a new 
> generator to be simply implemented by providing the source of randomness as 
> 64-bit longs.
> The {{UniformRandomProvider}} interface can be expanded to include these 
> generation methods. Using Java 8 default interface methods will not require 
> any changes to generators currently implementing the interface.
> I propose to:
>  # Add the new methods for streams and numbers in a range.
>  # Add default implementations of the current API. These can be extracted 
> from the  o.a.c.rng.core.BaseProvider implementations.
>  # Remove the implementations in o.a.c.rng.core.BaseProvider. This change 
> would be binary compatible.
> The base classes in commons core for 32-bit and 64-bit sources of randomness, 
> IntProvider and LongProvider, can be updated suitably to only override the 
> default interface methods where they can be more efficiently implemented 
> given the source of randomness. This applies to:
> ||Source||Update||Details||
> |int|nextBytes|Use nextInt() for the source of bytes|
> | |nextBoolean|Use a cached int for the randomness|
> | |nextInt|Directly supply the int rather than using 32-bits from nextLong()|
> | |nextDouble|Optimise the bits used from two ints for the 53-bits required 
> for the double.|
> |long|nextInt; nextBoolean|Use a cached long for the randomness|
> h3. Note 1
> The UniformRandomProvider also has the method:
> {code:java}
> void nextBytes(byte[] bytes,
>                int start,
>                int len);
> {code}
> This can also have a default implementation using the output from nextLong().
> h3. Note 2
> The methods to generate an exponential and Gaussian are already implemented 
> in the {{commons-rng-sampling}} module.
> java.util.Random has a nextGaussian() method and so this method appears to be 
> for backward compatibility with legacy Java code. The method is implemented 
> using a modified Ziggurat sampler which uses an exponential sampler for the 
> long tail. The API has thus exposed the exponential sampling method that is 
> used internally in the nextGaussian implementation.
> With no backward compatibility requirements the Commons RNG interface can 
> avoid the distribution sampling methods. Users should select an appropriate 
> sampler from the sampling module.
>  



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
