PuppetDB / PDB-165

storeconfigs face excessive memory requirement


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Normal
    • Resolution: Won't Fix
    • Affects Version/s: PDB 1.5.0, PDB 2.3.8
    • Fix Version/s: None
    • Component/s: None
    • Story Points:
      5

      Description

      I ran into a problem when trying to export my storeconfigs database prior to migrating to PuppetDB (following the steps here: http://docs.puppetlabs.com/puppetdb/latest/migrate.html#migrating-from-activerecord-storeconfigs).

      Initial conditions:

      • 8GB RAM
      • 2.2GB MySQL database
      • CentOS 6.4
      • puppetlabs-release-6-7

      Relevant section of /etc/puppet/puppet.conf:
      {{collapse
      <pre>
      [main]

      # Where Puppet stores dynamic and growing data.
      # The default value is '/var/puppet'.
      vardir = /var/lib/puppet

      # The Puppet log directory.
      # The default value is '$vardir/log'.
      logdir = /var/log/puppet

      # Where Puppet PID files are kept.
      # The default value is '$vardir/run'.
      rundir = /var/run/puppet

      # Where SSL certificates are kept.
      # The default value is '$confdir/ssl'.
      ssldir = $vardir/ssl

      # To install custom facts and plugins.
      pluginsync = true
      factpath = $vardir/lib/facter

      # Create graphs of the whole dependency tree in /var/lib/puppet/state/graphs/ for debugging.
      graph = true

      # Set the template dir.
      templatedir = /etc/puppet/files

      # Increase the timeout so it doesn't expire on some servers.
      configtimeout = 21600

      # Always flush logs to disk.
      autoflush = true

      # The environment Puppet is running in.
      environment = production

      facts_terminus = inventory_active_record
      storeconfigs = true
      dbadapter = mysql2
      dbserver = localhost
      dbname = puppet
      dbuser = puppet
      dbpassword = puppet
      dbsocket =
      dbmigrate = true
      dbconnections = 20
      </pre>
      }}
      BTW I also hit the problem described here:
      <pre>
      http://stackoverflow.com/questions/7243046/excessive-stat-calls-on-etc-localtime-in-rails-application and
      http://stackoverflow.com/questions/4554271/how-to-avoid-excessive-stat-etc-localtime-calls-in-strftime-on-linux
      </pre>
      Fixed by executing '# export TZ=:/etc/localtime' prior to launching the Puppet face.
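For anyone else hitting the /etc/localtime stat storm, a minimal sketch of that workaround (the commented strace line is illustrative, for confirming the effect on a running process):

```shell
# glibc re-stats /etc/localtime on every strftime() unless TZ is set.
# Pinning TZ to the same file keeps local time but skips the repeated stat calls.
export TZ=:/etc/localtime
echo "$TZ"

# To count the remaining stat() calls of a running export (PID is illustrative):
# strace -c -e trace=stat -p <pid>
```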

      So I launched the face. The database connection and data retrieval were successful, according to slow-queries.log:
      {{collapse
      <pre>

      # Time: 131018 10:45:38
      # User@Host: puppet[puppet] @ localhost []
      # Thread_id: 1206  Schema: puppet  Last_errno: 1160  Killed: 0
      # Query_time: 64.454990  Lock_time: 0.003983  Rows_sent: 434313  Rows_examined: 909237  Rows_affected: 0  Rows_read: 18446744073709551614
      # Bytes_sent: 165756544  Tmp_tables: 0  Tmp_disk_tables: 0  Tmp_table_sizes: 0
      # InnoDB_trx_id: 1222
      SET timestamp=1382085938;
        SELECT `hosts`.`id` AS t0_r0, `hosts`.`name` AS t0_r1, `hosts`.`ip` AS t0_r2, `hosts`.`environment` AS t0_r3, `hosts`.`last_compile` AS t0_r4, `hosts`.`last_freshcheck` AS t0_r5, `hosts`.`last_report` AS t0_r6, `hosts`.`updated_at` AS t0_r7, `hosts`.`source_file_id` AS t0_r8, `hosts`.`created_at` AS t0_r9, `resources`.`id` AS t1_r0, `resources`.`title` AS t1_r1, `resources`.`restype` AS t1_r2, `resources`.`host_id` AS t1_r3, `resources`.`source_file_id` AS t1_r4, `resources`.`exported` AS t1_r5, `resources`.`line` AS t1_r6, `resources`.`updated_at` AS t1_r7, `resources`.`created_at` AS t1_r8, `param_values`.`id` AS t2_r0, `param_values`.`value` AS t2_r1, `param_values`.`param_name_id` AS t2_r2, `param_values`.`line` AS t2_r3, `param_values`.`resource_id` AS t2_r4, `param_values`.`updated_at` AS t2_r5, `param_values`.`created_at` AS t2_r6, `puppet_tags`.`id` AS t3_r0, `puppet_tags`.`name` AS t3_r1, `puppet_tags`.`updated_at` AS t3_r2, `puppet_tags`.`created_at` AS t3_r3 FROM `hosts` LEFT OUTER JOIN `resources` ON resources.host_id = hosts.id LEFT OUTER JOIN `param_values` ON param_values.resource_id = resources.id LEFT OUTER JOIN `resource_tags` ON (`resources`.`id` = `resource_tags`.`resource_id`) LEFT OUTER JOIN `puppet_tags` ON (`puppet_tags`.`id` = `resource_tags`.`puppet_tag_id`) WHERE (`resources`.`exported` = 1);
        </pre>
        }}
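For scale, the raw result set in that slow-query entry is modest; a quick back-of-the-envelope from the figures above:

```shell
# Figures taken from the slow-queries.log entry above.
bytes_sent=165756544
rows_sent=434313
echo "result set: $((bytes_sent / 1024 / 1024)) MiB, ~$((bytes_sent / rows_sent)) bytes/row"
# → result set: 158 MiB, ~381 bytes/row
```

So the ~160 MiB of row data alone cannot explain tens of GB of process memory; presumably the blow-up happens when ActiveRecord materializes every joined row as Ruby objects at once.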
        Then the process started consuming 100% of one core, and memory usage climbed steadily over several hours until it filled all RAM and swap space. Finally the process was killed by the OOM killer:
        {{collapse
        <pre>
        [root@roger-test.ofi ~]# time puppet storeconfigs --verbose --debug export
        Debug: Puppet::Type::User::ProviderUser_role_add: file roleadd does not exist
        Debug: Puppet::Type::User::ProviderPw: file pw does not exist
        Debug: Puppet::Type::User::ProviderDirectoryservice: file /usr/bin/dsimport does not exist
        Debug: Failed to load library 'ldap' for feature 'ldap'
        Debug: Puppet::Type::User::ProviderLdap: feature ldap is missing
        Debug: Using settings: adding file resource 'hostcert': 'File[/var/lib/puppet/ssl/certs/roger-test.ofi.softonic.lan.pem]{:links=>:follow, :ensure=>:file, :backup=>false, :owner=>"puppet", :mode=>"644", :path=>"/var/lib/puppet/ssl/certs/roger-test.ofi.softonic.lan.pem", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'plugindest': 'File[/var/lib/puppet/lib]{:links=>:follow, :ensure=>:directory, :backup=>false, :path=>"/var/lib/puppet/lib", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'localcacert': 'File[/var/lib/puppet/ssl/certs/ca.pem]{:links=>:follow, :ensure=>:file, :backup=>false, :owner=>"puppet", :mode=>"644", :path=>"/var/lib/puppet/ssl/certs/ca.pem", :loglevel=>:debug}'
        Debug: Puppet::Type::Group::ProviderPw: file pw does not exist
        Debug: Puppet::Type::Group::ProviderDirectoryservice: file /usr/bin/dscl does not exist
        Debug: Failed to load library 'ldap' for feature 'ldap'
        Debug: Puppet::Type::Group::ProviderLdap: feature ldap is missing
        Debug: Using settings: adding file resource 'railslog': 'File[/var/log/puppet/rails.log]{:links=>:follow, :group=>"puppet", :ensure=>:file, :backup=>false, :owner=>"puppet", :mode=>"600", :path=>"/var/log/puppet/rails.log", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'hostcrl': 'File[/var/lib/puppet/ssl/crl.pem]{:links=>:follow, :ensure=>:file, :backup=>false, :owner=>"puppet", :mode=>"644", :path=>"/var/lib/puppet/ssl/crl.pem", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'certdir': 'File[/var/lib/puppet/ssl/certs]{:links=>:follow, :ensure=>:directory, :backup=>false, :owner=>"puppet", :path=>"/var/lib/puppet/ssl/certs", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'vardir': 'File[/var/lib/puppet]{:links=>:follow, :ensure=>:directory, :backup=>false, :path=>"/var/lib/puppet", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'requestdir': 'File[/var/lib/puppet/ssl/certificate_requests]{:links=>:follow, :ensure=>:directory, :backup=>false, :owner=>"puppet", :path=>"/var/lib/puppet/ssl/certificate_requests", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'rundir': 'File[/var/run/puppet]{:links=>:follow, :group=>"puppet", :ensure=>:directory, :backup=>false, :owner=>"puppet", :mode=>"755", :path=>"/var/run/puppet", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'rest_authconfig': 'File[/etc/puppet/auth.conf]{:links=>:follow, :ensure=>:file, :backup=>false, :path=>"/etc/puppet/auth.conf", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'yamldir': 'File[/var/lib/puppet/yaml]{:links=>:follow, :group=>"puppet", :ensure=>:directory, :backup=>false, :owner=>"puppet", :mode=>"750", :path=>"/var/lib/puppet/yaml", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'hostprivkey': 'File[/var/lib/puppet/ssl/private_keys/roger-test.ofi.softonic.lan.pem]{:links=>:follow, :ensure=>:file, :backup=>false, :owner=>"puppet", :mode=>"600", :path=>"/var/lib/puppet/ssl/private_keys/roger-test.ofi.softonic.lan.pem", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'reportdir': 'File[/var/lib/puppet/reports]{:links=>:follow, :group=>"puppet", :ensure=>:directory, :backup=>false, :owner=>"puppet", :mode=>"750", :path=>"/var/lib/puppet/reports", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'statedir': 'File[/var/lib/puppet/state]{:links=>:follow, :ensure=>:directory, :backup=>false, :mode=>"1755", :path=>"/var/lib/puppet/state", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'logdir': 'File[/var/log/puppet]{:links=>:follow, :group=>"puppet", :ensure=>:directory, :backup=>false, :owner=>"puppet", :mode=>"750", :path=>"/var/log/puppet", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'ssldir': 'File[/var/lib/puppet/ssl]{:links=>:follow, :ensure=>:directory, :backup=>false, :owner=>"puppet", :mode=>"771", :path=>"/var/lib/puppet/ssl", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'privatekeydir': 'File[/var/lib/puppet/ssl/private_keys]{:links=>:follow, :ensure=>:directory, :backup=>false, :owner=>"puppet", :mode=>"750", :path=>"/var/lib/puppet/ssl/private_keys", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'server_datadir': 'File[/var/lib/puppet/server_data]{:links=>:follow, :group=>"puppet", :ensure=>:directory, :backup=>false, :owner=>"puppet", :mode=>"750", :path=>"/var/lib/puppet/server_data", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'hostpubkey': 'File[/var/lib/puppet/ssl/public_keys/roger-test.ofi.softonic.lan.pem]{:links=>:follow, :ensure=>:file, :backup=>false, :owner=>"puppet", :mode=>"644", :path=>"/var/lib/puppet/ssl/public_keys/roger-test.ofi.softonic.lan.pem", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'manifestdir': 'File[/etc/puppet/manifests]{:links=>:follow, :ensure=>:directory, :backup=>false, :path=>"/etc/puppet/manifests", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'confdir': 'File[/etc/puppet]{:links=>:follow, :ensure=>:directory, :backup=>false, :path=>"/etc/puppet", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'masterhttplog': 'File[/var/log/puppet/masterhttp.log]{:links=>:follow, :group=>"puppet", :ensure=>:file, :backup=>false, :owner=>"puppet", :mode=>"660", :path=>"/var/log/puppet/masterhttp.log", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'publickeydir': 'File[/var/lib/puppet/ssl/public_keys]{:links=>:follow, :ensure=>:directory, :backup=>false, :owner=>"puppet", :path=>"/var/lib/puppet/ssl/public_keys", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'bucketdir': 'File[/var/lib/puppet/bucket]{:links=>:follow, :group=>"puppet", :ensure=>:directory, :backup=>false, :owner=>"puppet", :mode=>"750", :path=>"/var/lib/puppet/bucket", :loglevel=>:debug}'
        Debug: Using settings: adding file resource 'privatedir': 'File[/var/lib/puppet/ssl/private]{:links=>:follow, :ensure=>:directory, :backup=>false, :owner=>"puppet", :mode=>"750", :path=>"/var/lib/puppet/ssl/private", :loglevel=>:debug}'
        Debug: /File[/var/lib/puppet/reports]: Autorequiring File[/var/lib/puppet]
        Debug: /File[/var/lib/puppet/ssl/crl.pem]: Autorequiring File[/var/lib/puppet/ssl]
        Debug: /File[/var/lib/puppet/ssl/certs/ca.pem]: Autorequiring File[/var/lib/puppet/ssl/certs]
        Debug: /File[/var/lib/puppet/ssl/private]: Autorequiring File[/var/lib/puppet/ssl]
        Debug: /File[/var/lib/puppet/bucket]: Autorequiring File[/var/lib/puppet]
        Debug: /File[/var/lib/puppet/state]: Autorequiring File[/var/lib/puppet]
        Debug: /File[/var/lib/puppet/ssl/private_keys]: Autorequiring File[/var/lib/puppet/ssl]
        Debug: /File[/var/lib/puppet/lib]: Autorequiring File[/var/lib/puppet]
        Debug: /File[/etc/puppet/auth.conf]: Autorequiring File[/etc/puppet]
        Debug: /File[/var/lib/puppet/ssl/public_keys]: Autorequiring File[/var/lib/puppet/ssl]
        Debug: /File[/var/lib/puppet/ssl/public_keys/roger-test.ofi.softonic.lan.pem]: Autorequiring File[/var/lib/puppet/ssl/public_keys]
        Debug: /File[/var/lib/puppet/ssl/private_keys/roger-test.ofi.softonic.lan.pem]: Autorequiring File[/var/lib/puppet/ssl/private_keys]
        Debug: /File[/var/lib/puppet/yaml]: Autorequiring File[/var/lib/puppet]
        Debug: /File[/etc/puppet/manifests]: Autorequiring File[/etc/puppet]
        Debug: /File[/var/lib/puppet/ssl]: Autorequiring File[/var/lib/puppet]
        Debug: /File[/var/lib/puppet/ssl/certificate_requests]: Autorequiring File[/var/lib/puppet/ssl]
        Debug: /File[/var/log/puppet/rails.log]: Autorequiring File[/var/log/puppet]
        Debug: /File[/var/lib/puppet/ssl/certs]: Autorequiring File[/var/lib/puppet/ssl]
        Debug: /File[/var/lib/puppet/ssl/certs/roger-test.ofi.softonic.lan.pem]: Autorequiring File[/var/lib/puppet/ssl/certs]
        Debug: /File[/var/log/puppet/masterhttp.log]: Autorequiring File[/var/log/puppet]
        Debug: /File[/var/lib/puppet/server_data]: Autorequiring File[/var/lib/puppet]
        Debug: Finishing transaction 69924332004460
        Info: Connecting to mysql2 database: puppet
        Killed

      real 326m0.807s
      user 9m46.181s
      sys 11m29.771s
      </pre>
      }}
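To track how fast the face's memory grows during a retry, a simple polling sketch (Linux /proc only; the helper name is mine):

```shell
# Resident set size of a PID, in kB, read from /proc (Linux only).
rss_kb() {
    awk '/^VmRSS:/ {print $2}' "/proc/$1/status"
}

# Example: sample the current shell once. In practice, run this in a loop
# against the PID of `puppet storeconfigs export` and log the samples.
rss_kb $$
```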

      Dmesg showed:
      {{collapse
      <pre>
      puppet invoked oom-killer: gfp_mask=0x200da, order=0, oom_adj=0, oom_score_adj=0
      puppet cpuset=/ mems_allowed=0
      Pid: 6477, comm: puppet Not tainted 2.6.32-358.18.1.el6.x86_64 #1
      Call Trace:
      [<ffffffff810cb641>] ? cpuset_print_task_mems_allowed+0x91/0xb0
      [<ffffffff8111ce40>] ? dump_header+0x90/0x1b0
      [<ffffffff8121d49c>] ? security_real_capable_noaudit+0x3c/0x70
      [<ffffffff8111d2c2>] ? oom_kill_process+0x82/0x2a0
      [<ffffffff8111d201>] ? select_bad_process+0xe1/0x120
      [<ffffffff8111d700>] ? out_of_memory+0x220/0x3c0
      [<ffffffff8112c3ac>] ? __alloc_pages_nodemask+0x8ac/0x8d0
      [<ffffffff81160d3a>] ? alloc_pages_vma+0x9a/0x150
      [<ffffffff81154a72>] ? read_swap_cache_async+0xf2/0x160
      [<ffffffff81155599>] ? valid_swaphandles+0x69/0x150
      [<ffffffff81154b67>] ? swapin_readahead+0x87/0xc0
      [<ffffffff81143e7b>] ? handle_pte_fault+0x70b/0xb50
      [<ffffffff811444fa>] ? handle_mm_fault+0x23a/0x310
      [<ffffffff81146e72>] ? find_vma+0x12/0x80
      [<ffffffff810474e9>] ? __do_page_fault+0x139/0x480
      [<ffffffff8112c8b9>] ? free_pages+0x49/0x50
      [<ffffffff810097cc>] ? __switch_to+0x1ac/0x320
      [<ffffffff8150e130>] ? thread_return+0x4e/0x76e
      [<ffffffff81513b6e>] ? do_page_fault+0x3e/0xa0
      [<ffffffff81510f25>] ? page_fault+0x25/0x30
      Mem-Info:
      Node 0 DMA per-cpu:
      CPU 0: hi: 0, btch: 1 usd: 0
      CPU 1: hi: 0, btch: 1 usd: 0
      Node 0 DMA32 per-cpu:
      CPU 0: hi: 186, btch: 31 usd: 30
      CPU 1: hi: 186, btch: 31 usd: 0
      Node 0 Normal per-cpu:
      CPU 0: hi: 186, btch: 31 usd: 117
      CPU 1: hi: 186, btch: 31 usd: 3
      active_anon:1654606 inactive_anon:300632 isolated_anon:32
      active_file:118 inactive_file:130 isolated_file:0
      unevictable:0 dirty:0 writeback:9 unstable:0
      free:25326 slab_reclaimable:3001 slab_unreclaimable:5537
      mapped:135 shmem:0 pagetables:8403 bounce:0
      Node 0 DMA free:15716kB min:124kB low:152kB high:184kB active_anon:0kB inactive_anon:0kB active_file:0kB inactive_file:0kB unevictable:0kB isolated(anon):0kB isolated(file):0kB present:15320kB mlocked:0kB dirty:0kB writeback:0kB mapped:0kB shmem:0kB slab_reclaimable:0kB slab_unreclaimable:0kB kernel_stack:0kB pagetables:0kB unstable:0kB bounce:0kB writeback_tmp:0kB pages_scanned:0 all_unreclaimable? yes
      lowmem_reserve[]: 0 3512 8057 8057
      Node 0 DMA32 free:47528kB min:29404kB low:36752kB high:44104kB active_anon:2715328kB inactive_anon:552096kB active_file:40kB inactive_file:0kB unevictable:0kB isolated(anon):0kB isolated(file):0kB present:3596500kB mlocked:0kB dirty:0kB writeback:0kB mapped:76kB shmem:0kB slab_reclaimable:3064kB slab_unreclaimable:24kB kernel_stack:8kB pagetables:7200kB unstable:0kB bounce:0kB writeback_tmp:0kB pages_scanned:733 all_unreclaimable? yes
      lowmem_reserve[]: 0 0 4545 4545
      Node 0 Normal free:38060kB min:38052kB low:47564kB high:57076kB active_anon:3903096kB inactive_anon:650432kB active_file:432kB inactive_file:520kB unevictable:0kB isolated(anon):128kB isolated(file):0kB present:4654080kB mlocked:0kB dirty:0kB writeback:36kB mapped:464kB shmem:0kB slab_reclaimable:8940kB slab_unreclaimable:22124kB kernel_stack:928kB pagetables:26412kB unstable:0kB bounce:0kB writeback_tmp:0kB pages_scanned:8896 all_unreclaimable? yes
      lowmem_reserve[]: 0 0 0 0
      Node 0 DMA: 3*4kB 1*8kB 1*16kB 2*32kB 2*64kB 1*128kB 0*256kB 0*512kB 1*1024kB 1*2048kB 3*4096kB = 15716kB
      Node 0 DMA32: 90*4kB 20*8kB 12*16kB 11*32kB 10*64kB 4*128kB 1*256kB 2*512kB 31*1024kB 4*2048kB 1*4096kB = 47528kB
      Node 0 Normal: 219*4kB 168*8kB 122*16kB 50*32kB 26*64kB 15*128kB 6*256kB 7*512kB 11*1024kB 2*2048kB 2*4096kB = 38028kB
      113680 total pagecache pages
      113419 pages in swap cache
      Swap cache stats: add 170900422, delete 170787003, find 60760359/74303185
      Free swap = 0kB
      Total swap = 8208376kB
      2097151 pages RAM
      82183 pages reserved
      323 pages shared
      1986100 pages non-shared
      [ pid ] uid tgid total_vm rss cpu oom_adj oom_score_adj name
      [ 510] 0 510 2727 0 0 -17 -1000 udevd
      [ 855] 0 855 2726 0 0 -17 -1000 udevd
      [ 1179] 0 1179 62286 49 0 0 0 rsyslogd
      [ 1208] 0 1208 2704 52 0 0 0 irqbalance
      [ 1295] 81 1295 5350 1 1 0 0 dbus-daemon
      [ 1308] 0 1308 34048 1 1 0 0 ruby
      [ 1338] 0 1338 1019 0 1 0 0 acpid
      [ 1360] 0 1360 49314 200 0 0 0 snmpd
      [ 1372] 0 1372 16563 0 0 -17 -1000 sshd
      [ 1401] 38 1401 7540 35 1 0 0 ntpd
      [ 1417] 0 1417 27050 1 0 0 0 mysqld_safe
      [ 1607] 498 1607 985537 2421 0 0 0 mysqld
      [ 1636] 99 1636 76539 835 0 0 0 gmond
      [ 1644] 497 1644 10247 11 1 0 0 nrpe
      [ 1722] 0 1722 20335 34 0 0 0 master
      [ 1736] 89 1736 20398 16 0 0 0 qmgr
      [ 1746] 0 1746 27544 1 1 0 0 abrtd
      [ 1754] 0 1754 29313 23 0 0 0 crond
      [ 1770] 0 1770 1015 1 0 0 0 mingetty
      [ 1771] 0 1771 1019 1 1 0 0 agetty
      [ 1773] 0 1773 1015 1 0 0 0 mingetty
      [ 1775] 0 1775 1015 1 0 0 0 mingetty
      [ 1777] 0 1777 1015 1 0 0 0 mingetty
      [ 1778] 0 1778 2726 0 1 -17 -1000 udevd
      [ 1780] 0 1780 1015 1 1 0 0 mingetty
      [ 1782] 0 1782 1015 1 1 0 0 mingetty
      [ 6441] 0 6441 30133 1 0 0 0 screen
      [ 6442] 0 6442 27511 1 0 0 0 bash
      [ 6474] 0 6474 1100 15 0 0 0 strace
      [ 6477] 0 6477 3515527 1838439 0 0 0 puppet
      [10083] 89 10083 20355 26 0 0 0 pickup
      Out of memory: Kill process 6477 (puppet) score 823 or sacrifice child
      Killed process 6477, UID 0, (puppet) total-vm:14062108kB, anon-rss:7353732kB, file-rss:24kB
      </pre>
      }}
      On my second attempt, I raised my VM's RAM to 28GB, and the export was killed the same way.

      On my third attempt, I used 42GB of RAM and this time the export succeeded. I can't tell exactly how much memory the process required, but IMHO that's way too much for a 2.2GB database whose export is just this:
      <pre>
      -rw-r--r-- 1 root root 1.8M Oct 16 14:31 storeconfigs-20131016143132.tar.gz
      </pre>
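One way to keep a retry from dragging the whole VM into swap is to cap the process's address space up front, so a runaway export fails with an allocation error instead of triggering the OOM killer; a sketch (the 12 GiB figure is arbitrary):

```shell
(
  # Cap virtual memory for this subshell only, in kB (12 GiB here, illustrative).
  ulimit -v $((12 * 1024 * 1024))
  ulimit -v    # prints 12582912, confirming the cap took effect
  # time puppet storeconfigs --verbose --debug export
)
```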
      After some conversations in #puppet, I was advised to file a bug, so here it is.

      I straced the whole failed run to a file; if you need it, just let me know.

      Best,

      Roger Torrentsgenerós

        People

        Assignee: Unassigned
        Reporter: redmine.exporter
        Votes: 0
        Watchers: 4