531651225 opened a new issue, #3454:
URL: https://github.com/apache/incubator-streampark/issues/3454

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/incubator-streampark/issues?q=is%3Aissue+label%3A%22bug%22)
 and found no similar issues.
   
   
   ### Java Version
   
   1.8
   
   ### Scala Version
   
   2.11.x
   
   ### StreamPark Version
   
   2.1.2
   
   ### Flink Version
   
   1.14.4
   
   ### deploy mode
   
   kubernetes-application
   
   ### What happened
   
   Copying an Application in StreamPark fails; the error is logged as shown below in Error Exception.
   
   ### Error Exception
   
   ```log
   2024-01-03 14:21:46 | INFO | XNIO-1 task-4 | org.apache.streampark.console.base.handler.GlobalExceptionHandler:54] Internal server error:
   org.springframework.dao.DataIntegrityViolationException:
   ### Error updating database.  Cause: java.sql.SQLException: Field 
'modify_time' doesn't have a default value
   ### The error may exist in 
org/apache/streampark/console/core/mapper/ApplicationMapper.java (best guess)
   ### The error may involve 
org.apache.streampark.console.core.mapper.ApplicationMapper.insert-Inline
   ### The error occurred while setting parameters
   ### SQL: INSERT INTO t_flink_app  ( team_id, job_type,  user_id, job_name,   
 version_id, cluster_id, flink_image, k8s_namespace, state, `release`,    
option_state,  args,  options,  resolve_order, execution_mode, 
dynamic_properties, app_type, tracking, jar, jar_check_sum, main_class,         
      description, create_time,   k8s_rest_exposed_type, k8s_pod_template, 
k8s_jm_pod_template, k8s_tm_pod_template,   resource_from, 
k8s_hadoop_integration, tags )  VALUES  ( ?, ?,  ?, ?,    ?, ?, ?, ?, ?, ?,    
?,  ?,  ?,  ?, ?, ?, ?, ?, ?, ?, ?,               ?, ?,   ?, ?, ?, ?,   ?, ?, ? 
)
   ### Cause: java.sql.SQLException: Field 'modify_time' doesn't have a default 
value
   ; Field 'modify_time' doesn't have a default value; nested exception is 
java.sql.SQLException: Field 'modify_time' doesn't have a default value
           at 
org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:251)
           at 
org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:70)
           at 
org.mybatis.spring.MyBatisExceptionTranslator.translateExceptionIfPossible(MyBatisExceptionTranslator.java:91)
           at 
org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java:441)
           at com.sun.proxy.$Proxy107.insert(Unknown Source)
           at 
org.mybatis.spring.SqlSessionTemplate.insert(SqlSessionTemplate.java:272)
           at 
com.baomidou.mybatisplus.core.override.MybatisMapperMethod.execute(MybatisMapperMethod.java:59)
           at 
com.baomidou.mybatisplus.core.override.MybatisMapperProxy$PlainMethodInvoker.invoke(MybatisMapperProxy.java:148)
           at 
com.baomidou.mybatisplus.core.override.MybatisMapperProxy.invoke(MybatisMapperProxy.java:89)
           at com.sun.proxy.$Proxy119.insert(Unknown Source)
           at 
com.baomidou.mybatisplus.extension.service.IService.save(IService.java:63)
           at 
org.apache.streampark.console.core.service.impl.ApplicationServiceImpl.save(ApplicationServiceImpl.java:748)
           at 
org.apache.streampark.console.core.service.impl.ApplicationServiceImpl.copy(ApplicationServiceImpl.java:816)
   ```
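
   The stack trace shows `ApplicationServiceImpl.copy` calling `IService.save`, and the generated INSERT lists `create_time` but not `modify_time`, so a `NOT NULL` column without a database default is rejected. A minimal sketch of the likely fix, using a hypothetical stand-in for the Application entity (field names mirror the `t_flink_app` columns; this is not the actual StreamPark class):

   ```java
   import java.util.Date;

   // Hypothetical minimal stand-in for the StreamPark Application entity;
   // field names mirror the t_flink_app columns in the failing INSERT.
   class Application {
       String jobName;
       Date createTime;
       Date modifyTime;
   }

   public class CopyAppSketch {
       // Sketch of the fix: when cloning an application, populate both
       // timestamp fields so the INSERT supplies a value for the NOT NULL
       // `modify_time` column instead of relying on a database default.
       static Application copyOf(Application src, String newName) {
           Application copy = new Application();
           copy.jobName = newName;
           Date now = new Date();
           copy.createTime = now;
           copy.modifyTime = now; // missing in the failing code path per the stack trace
           return copy;
       }

       public static void main(String[] args) {
           Application original = new Application();
           original.jobName = "flink-job";
           Application clone = copyOf(original, "flink-job-copy");
           System.out.println(clone.modifyTime != null);
       }
   }
   ```

   Since the project uses MyBatis-Plus (visible in the trace), another option would be auto-filling the field via a `MetaObjectHandler` insert-fill, or adding `DEFAULT CURRENT_TIMESTAMP` to the `modify_time` column in the schema; either approach should be verified against the actual entity mapping.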
   
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   

