Turkesh Patel (openERP) has proposed merging 
lp:~openerp-dev/openobject-addons/trunk-clean-yml-survey-tpa into 
lp:openobject-addons.

Requested reviews:
  OpenERP Core Team (openerp)

For more details, see:
https://code.launchpad.net/~openerp-dev/openobject-addons/trunk-clean-yml-survey-tpa/+merge/84223

Survey: clean the test cases
-----------------------------------

* modified survey_demo.xml 
* added survey_demo.yml
* added draft2open2close_request.yml
* renamed survey00.yml to draft2open2close_survey.yml (the test pattern is sketched below)
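
The renamed tests follow the usual OpenERP YAML test pattern: a comment step describing the action, a !python step that calls the workflow method, and an !assert step that checks the resulting state. A minimal sketch of the draft-to-open-to-close pattern used by the files below (the record ids refer to the survey demo data):

-
  Now I set "OpenERP Partner Feedback" to the open state.
-
  !python {model: survey}: |
    self.survey_open(cr, uid, [ref("survey_partner_feedback")], context)
-
  I check that the Survey is in the open state.
-
  !assert {model: survey, id: survey_partner_feedback, string: Survey should be in open state}:
    - state == 'open'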

-- 
https://code.launchpad.net/~openerp-dev/openobject-addons/trunk-clean-yml-survey-tpa/+merge/84223
Your team OpenERP R&D Team is subscribed to branch 
lp:~openerp-dev/openobject-addons/trunk-clean-yml-survey-tpa.
=== modified file 'survey/__openerp__.py'
--- survey/__openerp__.py	2011-10-11 20:05:49 +0000
+++ survey/__openerp__.py	2011-12-02 07:41:28 +0000
@@ -48,8 +48,10 @@
                    'wizard/survey_send_invitation.xml'],
     'demo_xml': ['survey_demo.xml'],
     'test': [
-        'test/survey00.yml',
-        'test/survey_report.yml',
+        'test/draft2open2close_survey.yml',
+        'test/draft2open2close_request.yml',
+        'test/survey_demo.yml',
+        'test/survey_report.yml'
     ],
     'installable': True,
     'active': False,

=== modified file 'survey/survey_demo.xml'
--- survey/survey_demo.xml	2011-10-27 21:11:24 +0000
+++ survey/survey_demo.xml	2011-12-02 07:41:28 +0000
@@ -16,6 +16,7 @@
             <field name="tot_comp_survey">1</field>
             <field name="tot_start_survey">1</field>
             <field name="response_user">5</field>
+            <field name="note">Partner Feedback details</field>
             <field name="date_open" eval="time.strftime('%Y-%m-%d %H:%M:%S')"/>
            </record>
 

=== added file 'survey/test/draft2open2close_request.yml'
--- survey/test/draft2open2close_request.yml	1970-01-01 00:00:00 +0000
+++ survey/test/draft2open2close_request.yml	2011-12-02 07:41:28 +0000
@@ -0,0 +1,48 @@
+-
+  In order to check the Survey Request workflow, I create a Survey Request for the survey "OpenERP Partner Feedback".
+-
+  !record {model: survey.request, id: survey_request_1}:
+    survey_id: survey_partner_feedback
+    user_id: base.user_demo
+    state: draft
+-
+  Now I set the Survey Request to the waiting state.
+-
+  !python {model: survey.request}: |
+    self.survey_req_waiting_answer(cr, uid, [ref("survey_request_1")], context)
+-
+  I check that the Survey Request is in the waiting state.
+-
+  !assert {model: survey.request, id: survey_request_1, severity: error, string: Survey Request should be in waiting state}:
+    - state == 'waiting_answer'
+-
+  I set the Survey Request to the cancel state.
+-
+  !python {model: survey.request}: |
+    self.survey_req_cancel(cr, uid, [ref("survey_request_1")], context)
+-
+  I check that the Survey Request is in the cancel state.
+-
+  !assert {model: survey.request, id: survey_request_1, severity: error, string: Survey Request should be in cancel state}:
+    - state == 'cancel'
+-
+  I set the Survey Request to the draft state.
+-
+  !python {model: survey.request}: |
+    self.survey_req_draft(cr, uid, [ref("survey_request_1")], context)
+-
+  I check that the Survey Request is in the draft state.
+-
+  !assert {model: survey.request, id: survey_request_1, severity: error, string: Survey Request should be in draft state}:
+    - state == 'draft'
+-
+  I set the Survey Request to the done state.
+-
+  !python {model: survey.request}: |
+    self.survey_req_waiting_answer(cr, uid, [ref("survey_request_1")], context)
+    self.survey_req_done(cr, uid, [ref("survey_request_1")], context)
+-
+  I check that the Survey Request is in the done state.
+-
+  !assert {model: survey.request, id: survey_request_1, severity: error, string: Survey Request should be in done state}:
+    - state == 'done'
\ No newline at end of file

=== renamed file 'survey/test/survey00.yml' => 'survey/test/draft2open2close_survey.yml'
--- survey/test/survey00.yml	2011-11-09 18:12:56 +0000
+++ survey/test/draft2open2close_survey.yml	2011-12-02 07:41:28 +0000
@@ -1,159 +1,96 @@
--   |
-    Survey Scenario:
-    In order to check the survey module in OpenERP.
--  
-    I Create the one survey and give survey title "Partner Feedback" and define the survey pages and survey question.
--
-    I Create "Partner Feedback" survey.
--
-    !record {model: 'survey', id: survey_partner_0}:
-        title: 'Partner Feedback'
-        max_response_limit: 20
--
-    I Create "Who are you?" page in "Partner Feedback" survey with title .
--
-    !record {model: 'survey.page', id: survey_partner_page_0}:
-        title: 'Who are you?'
-        survey_id: survey_partner_0
--
-    I Create "What is your company name?" question in "Who are you" survey page.
--
-    !record {model: 'survey.question', id: survey_p_question_0}:
-        question: 'What is your company name?'
-        type: single_textbox
-        sequence: 1
-        page_id: survey_partner_page_0
--
-    I Create "What is your company size?" question in "Who are you" survey page.
--
-    !record {model: 'survey.question', id: survey_p_question_1}:
-        question: 'What is your company size?'
-        type: multiple_choice_only_one_ans
-        sequence: 2
-        is_require_answer: true
-        page_id: survey_partner_page_0
--
-    I Create "1-50" answer in question "What is your company size?"
--
-    !record {model: 'survey.answer', id: survey_p_1_1}:
-        answer: '1 - 50'
-        sequence: 1
-        question_id : survey_p_question_1
--
-    I Create "51 - 100" answer in question "What is your company size?"
--
-    !record {model: 'survey.answer', id: survey_p_1_2}:
-        answer: '51 - 100'
-        sequence: 2
-        question_id : survey_p_question_1
--
-    I Create "100 - 500" answer in question "What is your company size?"
--
-    !record {model: 'survey.answer', id: survey_p_1_3}:
-        answer: '100 - 500'
-        sequence: 3
-        question_id : survey_p_question_1
--
-    I Create "500 - 1000" answer in question "What is your company size?"
--
-    !record {model: 'survey.answer', id: survey_p_1_4}:
-        answer: '500 - 1000'
-        sequence: 4
-        question_id : survey_p_question_1
--
-    I Create "> 1000" answer in question "What is your company size?"
--
-    !record {model: 'survey.answer', id: survey_p_1_5}:
-        answer: '> 1000'
-        sequence: 5
-        question_id : survey_p_question_1
--
-    I Create another "Contract" page in "Partner Feedback" survey.
--
-    !record {model: 'survey.page', id: survey_partner_page_1}:
-        title: 'Contract'
-        survey_id: survey_partner_0
--
-    I Create "Which maintenance contract do you sell to your customers." question in "Contract" survey page.
--
-    !record {model: 'survey.question', id: survey_p_question_3}:
-        question: 'Which maintenance contract do you sell to your customers.'
-        type: multiple_choice_only_one_ans
-        sequence: 1
-        page_id: survey_partner_page_1
--
-    I Create "OpenERP maintenance contract" answer in question "Which maintenance contract do you sell to your customers."
--
-    !record {model: 'survey.answer', id: survey_p_3_1}:
-        answer: 'OpenERP maintenance contract'
-        sequence: 1
-        question_id : survey_p_question_3
--
-    I Create "Your own contract, but you buy an OpenERP one" answer in question "Which maintenance contract do you sell to your customers."
--
-    !record {model: 'survey.answer', id: survey_p_3_2}:
-        answer: 'Your own contract, but you buy an OpenERP one'
-        sequence: 2
-        question_id : survey_p_question_3
--
-    I Create "Your own contract without buying an OpenERP one" answer in question "Which maintenance contract do you sell to your customers."
--
-    !record {model: 'survey.answer', id: survey_p_3_3}:
-        answer: 'Your own contract without buying an OpenERP one'
-        sequence: 3
-        question_id : survey_p_question_3
--
-    I Create "When do you propose a maintenance contract to your customers?" question in "Contract" survey page.
--
-    !record {model: 'survey.question', id: survey_p_question_4}:
-        question: When do you propose a maintenance contract to your customers?
-        type: multiple_choice_only_one_ans
-        sequence: 2
-        comment_field_type: text
-        comment_label: Why?
-        is_require_answer: true
-        is_comment_require: true
-        page_id: survey_partner_page_1
--
-    I Create "With each integration" answer in question "When do you propose a maintenance contract to your customers?"
--
-    !record {model: 'survey.answer', id: survey_p_4_1}:
-        answer: 'With each integration'
-        sequence: 1
-        question_id : survey_p_question_4
--
-    I Create "Sometimes" answer in question "When do you propose a maintenance contract to your customers?"
--
-    !record {model: 'survey.answer', id: survey_p_4_2}:
-        answer: 'Sometimes'
-        sequence: 2
-        question_id : survey_p_question_4
--
-    I Create "Never... " answer in question "When do you propose a maintenance contract to your customers?"
--
-    !record {model: 'survey.answer', id: survey_p_4_3}:
-        answer: 'Never... '
-        sequence: 3
-        question_id : survey_p_question_4
--
-    Now Survey set in open state.
--
-    !python {model: survey}: |
-        self.survey_open(cr, uid, [ref("survey_partner_0")], context)
--
-    Give answer of the survey, Run "Answer a Survey" wizard and select the survey and press on start button then run the selected survey.
--
-    !python {model: survey.name.wiz}: |
-      id = self.create(cr, uid, {'survey_id': ref("survey_partner_0")}) 
-      self.action_next(cr, uid, [id], context)
--
-    Give answer of the first and second page in "Partner Feedback" survey.
--
-    !python {model: survey.question.wiz}: |
-      ids = self.create(cr, uid, {str(ref("survey_p_question_0")) +"_single" :'Tiny' , str(ref("survey_p_question_1")) + "_selection" :int(ref("survey_p_1_1"))}, context)
-      ids = self.create(cr, uid, {str(ref("survey_p_question_3")) +"_selection" : int(ref("survey_p_3_1")), str(ref("survey_p_question_4")) +"_selection": int(ref("survey_p_4_1"))},context)
--
-    Set the value in "Total start survey" field.
--
-    !python {model: survey}: |
-      ids = self.write(cr, uid, ref("survey_partner_0"), {'tot_start_survey' : 1}, context)
+-
+  In order to check the survey module in OpenERP, I test the survey workflow from draft to open to close.
+-
+  Now I set "OpenERP Partner Feedback" in open state.
+-
+  !python {model: survey}: |
+    self.survey_open(cr, uid, [ref("survey_partner_feedback")], context)
+-
+  I check that the Survey is in the open state.
+-
+  !assert {model: survey, id: survey_partner_feedback, severity: error, string: Survey should be in open state}:
+    - state == 'open'
+-
+  I find a mistake in the survey, so I cancel it and then open it again.
+-
+  !python {model: survey}: |
+    self.survey_cancel(cr, uid, [ref('survey_partner_feedback')],context)
+    self.survey_open(cr, uid, [ref('survey_partner_feedback')],context)
+-
+  I check that the Survey is in the open state.
+-
+  !assert {model: survey, id: survey_partner_feedback, severity: error, string: Survey should be in open state}:
+    - state == 'open'
+-
+  Now I set "OpenERP Partner Feedback" in open state.
+-
+  !python {model: survey}: |
+    self.survey_open(cr, uid, [ref("survey_partner_feedback")], context)
+-
+  I print the survey.
+-
+  !python {model: survey.print}: |
+    id = self.create(cr, uid, {'survey_ids': [(6,0,[ref('survey.survey_partner_feedback')])]})
+    self.action_next(cr, uid, [id], context)
+-
+  To give answers to the survey, I run the "Answer a Survey" wizard, select the survey and press the Start button to run the selected survey.
+-
+  !python {model: survey.name.wiz}: |
+    id = self.create(cr, uid, {'survey_id': ref("survey_partner_feedback")})
+    self.action_next(cr, uid, [id], context)
+-
+  I give answers for the first and second pages of the "Partner Feedback" survey.
+-
+  !python {model: survey.question.wiz}: |
+    ctx = {'active_model':'survey', 'active_id': ref('survey_partner_feedback'), 'active_ids': [ref('survey_partner_feedback')]}
+    self.fields_view_get(cr, uid, ref("survey.view_survey_question_message"),"form", context=ctx)
+    values = self.default_get(cr, uid, ['name'], ctx)
+    ids = self.create(cr, uid, {str(ref("survey_question_company_name")) +"_single" :'Tiny' ,
+          str(ref("survey_question_company_size")) + "_selection" : int(ref("survey.survey_question_company_size_51")),
+          str(ref("survey_question_located")) + "_selection" : int(ref("survey.survey_question_company_size_Asia")),
+          str(ref("survey_question_offical_partner")) + "_selection" : int(ref("survey.survey_question_company_size_2009")),
+          str(ref("survey_question_your_customers")) + "_selection" : int(ref("survey.survey_answer_saas_services")),
+          }, context)
+    self.action_next(cr, uid, [ids], context)
+    ids = self.create(cr, uid, {
+          str(ref("survey_answer_Ambiguous")) + "_selection" : int(ref("survey_answer_Clear")),
+          str(ref("survey_question_contract_customers")) + "_selection" : int(ref("survey_answer_sometimes")),
+          str(ref("survey_question_sell_to_your_customers")) + "_selection" : int(ref("survey_answer_maintenance_contract")),
+          }, context)
+    self.action_next(cr, uid, [ids], context)
+-
+  I browse the survey answers.
+-
+  !python {model: survey.browse.answer}: |
+    id = self.create(cr, uid, {'survey_id': ref('survey.survey_partner_feedback')})
+    self.action_next(cr, uid, [id], context)
+-
+  I print the survey answers.
+-
+  !python {model: survey.print.answer}: |
+    id = self.create(cr, uid, {'response_ids': [(6,0,[ref('survey.survey_partner_feedback')])]})
+    self.action_next(cr, uid, [id], context)
+-
+  I send an invitation for the survey.
+-
+  !python {model: survey.send.invitation}: |
+    context = {'active_model':'survey', 'active_id': ref('survey_partner_feedback'), 'active_ids': [ref('survey_partner_feedback')]}
+    values = self.default_get(cr, uid, ['mail_from', 'mail_subject', 'send_mail_existing', 'mail_subject_existing', 'mail', 'partner_ids', 'send_mail'], context)
+    values['mail_from'] = 'Surveyor'
+    new_id = self.create(cr, uid, values)
+    self.action_send(cr, uid, [new_id], context)
+-
+  I set the value of the "Total start survey" field.
+-
+  !python {model: survey}: |
+    ids = self.write(cr, uid, ref("survey_partner_feedback"), {'tot_start_survey' : 1}, context)
+-
+  Now I set "OpenERP Partner Feedback" in close state.
+-
+  !python {model: survey}: |
+    self.survey_close(cr, uid, [ref("survey_partner_feedback")], context)
+-
+  I check that the Survey is in the close state.
+-
+  !assert {model: survey, id: survey_partner_feedback, severity: error, string: Survey should be in close state}:
+    - state == 'close'
\ No newline at end of file

=== added file 'survey/test/survey_demo.yml'
--- survey/test/survey_demo.yml	1970-01-01 00:00:00 +0000
+++ survey/test/survey_demo.yml	2011-12-02 07:41:28 +0000
@@ -0,0 +1,10 @@
+-
+  !record {model: survey.question, id: survey_question_located, view: False}:
+    type: multiple_choice_only_one_ans
+-
+  !record {model: survey.request, id: survey_request_0, view: False}:
+    survey_id: survey_odoo_feedback
+    user_id: base.user_demo
+-
+  !record {model: survey.name.wiz, id: survey_name_wiz, view: False}:
+    survey_id: survey_odoo_feedback
\ No newline at end of file

=== modified file 'survey/test/survey_report.yml'
--- survey/test/survey_report.yml	2011-01-14 00:11:01 +0000
+++ survey/test/survey_report.yml	2011-12-02 07:41:28 +0000
@@ -3,7 +3,7 @@
 -
   !python {model: survey}: |
     ctx={}
-    ctx.update({'model': 'survey','active_ids': [(6,0,[ref('survey.survey_partner_feedback')])]})
+    ctx.update({'model': 'survey','active_ids': [(6,0,[ref('survey_partner_feedback')])]})
     data_dict = {'response_ids' : [(6,0,[ref('survey.survey_partner_feedback')])], 'page_number' : True, 'without_pagebreak': True}
     from tools import test_reports
     test_reports.try_report_action(cr, uid, 'action_view_survey_print_answer',wiz_data=data_dict, context=ctx, our_module='survey')
@@ -13,7 +13,7 @@
 -
   !python {model: survey}: |
     ctx={}
-    ctx.update({'model': 'survey','active_ids': [(6,0,[ref('survey.survey_partner_feedback')])]})
+    ctx.update({'model': 'survey','active_ids': [(6,0,[ref('survey_partner_feedback')])]})
     data_dict = {'survey_ids' : [(6,0,[ref('survey.survey_partner_feedback')])]}
     from tools import test_reports
     test_reports.try_report_action(cr, uid, 'action_view_survey_print_statistics',wiz_data=data_dict, context=ctx, our_module='survey')
@@ -23,7 +23,7 @@
 -
   !python {model: survey}: |
     ctx={}
-    ctx.update({'model': 'survey','active_ids': [(6,0,[ref('survey.survey_partner_feedback')])]})
+    ctx.update({'model': 'survey','active_ids': [(6,0,[ref('survey_partner_feedback')])]})
     data_dict = {'survey_ids' : [(6,0,[ref('survey.survey_partner_feedback')])], 'page_number' : True, 'without_pagebreak': True}
     from tools import test_reports
     test_reports.try_report_action(cr, uid, 'action_view_survey_print',wiz_data=data_dict, context=ctx, our_module='survey')
