
EVALUATION OF LISREL 9.10 STUDENT EDITION
By: Abdullah M. Jaubah

Introduction

LISREL has been criticized for the shortcomings contained in the program package. Gendro Wiyono (2011:395), in his book Merancang Penelitian Bisnis dengan alat analisis SPSS 17.0 & SmartPLS 2.0, makes this criticism: "Compared with covariance-based Structural Equation Modeling (SEM), which is already widely used in packages such as LISREL, AMOS, EQS, COSAN, and EZPATH, there are two important points about PLS, which is variance based, namely its ability to avoid two serious problems."

Gendro Wiyono identifies these two serious problems as inadmissible solutions and factor indeterminacy, and explains them as follows:

1. Inadmissible Solution

That is, a solution that cannot be accepted. In variance-based PLS the problem of matrix singularity never arises. In addition, because PLS works on recursive structural models, the problems of unidentified, under-identified, or over-identified models also do not occur.

2. Factor Indeterminacy

That is, factors that cannot be determined, which arises when more than one factor is present in the set of indicators of a variable. Formative indicators in particular do not require a common factor, so the latent variable obtained is always a composite. In such a case the latent variable is a linear combination of its indicators, as the small sketch below illustrates.
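To make the composite idea concrete, here is a minimal numeric sketch in Python, assuming purely invented indicator values and outer weights; none of the numbers come from LISREL, PLS, or SmartPLS.

    # Illustrative sketch (hypothetical numbers): a composite latent variable
    # built as a linear combination of its (formative) indicators, with no
    # common factor assumed among the indicators.
    import numpy as np

    indicators = np.array([4.0, 3.5, 5.0])   # hypothetical indicator values for one case
    weights    = np.array([0.45, 0.25, 0.30])  # hypothetical outer weights of the formative block

    # The latent variable score is simply the weighted sum of the indicators
    latent_score = indicators @ weights
    print(latent_score)  # 4.175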

This criticism is not accompanied by an example in which the same data are run in both PLS and LISREL, so that the differences between the two could be brought out.

A weakness of PLS, VisualPLS, and SmartPLS is that the examples included in those packages are extremely limited, so an in-depth study based on them is not possible. The examples included in LISREL 9.10 Student Edition, by contrast, fill 20 folders, and each folder contains many data files, LISREL project syntax files, SIMPLIS project syntax files, and PRELIS syntax files. Another weakness of the PLS, VisualPLS, and SmartPLS packages is the absence of complete Help content.

This paper was written to answer the question: why does the LISREL 9.10 Student Edition package contain 20 folders, each holding many files?

LISREL 9.10 Student Edition

Many example files are available in LISREL that are not available in PLS, VisualPLS, or SmartPLS. These files are presumably provided to build cognitive command of the LISREL package through running them.

Cognitive command of LISREL 9.10 covers the ability to know, understand, apply, analyze, draw conclusions about, and evaluate LISREL 9.10. The example files can be run to produce output and, in many cases, path diagrams. These results can be interpreted, so the ability to interpret output and path diagrams can be improved. The files appear to be provided as study material, and the results of running them can likewise be used as study material. Several files are available and can be run, and their results can be compared with one another, so they can also serve as material for comparative study.

Capabilities of LISREL 9.10 Student Edition

Besides structural equation modeling, LISREL 9.10 Student Edition offers a range of further capabilities: Generalized Linear Modeling, Multilevel Modeling with Design Weights, Multilevel Structural Equation Modeling, Multilevel Nonlinear Modeling, Multilevel Modeling, Multivariate Censored Regression, Censored Regression, MINRES Exploratory Factor Analysis, Full Information Maximum Likelihood (FIML) for Data with Missing Values, Multiple Imputation for Data with Missing Values, Formal Inference-based Recursive Modeling (FIRM), Latent Variable Scores, Robust Standard Errors and Chi-Squares, and so on.

Are the possibilities listed above also available in PLS, VisualPLS, or SmartPLS?

Comparison Between PLS and LISREL 9.10

SmartPLS is one alternative method of Structural Equation Modeling. SmartPLS is related to VisualPLS; one of its examples is presented under the name demo.vpl.

The social and behavioral sciences now deal with highly complex problems in which many kinds of factors must be considered: quantitative as well as qualitative factors, and factors that can be observed and measured directly as well as factors that cannot be observed or measured directly.

Partial Least Squares, Visual Partial Least Squares, Smart Partial Least Squares, and Generalized Structured Component Analysis are Structural Equation Modeling tools that can be used to analyze latent variables, indicator variables, and measurement errors.

Proponents of Partial Least Squares hold that the LISREL, AMOS, EQS, COSAN, EZPATH, RAMONA, and SEPATH packages suffer from two problems, namely inadmissible solutions and factor indeterminacy. Models in PLS can be grouped into reflective models and formative models.

Model Specification

The analysis of relationships among variables and indicators consists of the outer model, the inner model, and the weight relation. The outer model reflects the specification of the relationship between a latent variable and its indicators. The outer model is also called the outer relation or measurement model, that is, the model that reflects the characteristics of a latent variable and its indicator (manifest) variables. The inner model reflects the specification of the relationships among the latent variables (the structural equation model). The weight relation is the estimate of the values of the latent variables. The inner and outer models provide the specification, which is followed by estimation of the weight relations.
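As a rough illustration of how these three parts fit together, the sketch below builds latent scores from standardized indicators through a weight relation and then estimates one inner (structural) relation by a simple regression. The data, the outer weights w_x and w_y, and the two-block layout are all invented for illustration; this is not the estimation procedure of SmartPLS or LISREL.

    # Minimal sketch of outer model / weight relation / inner model on invented data.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200

    # Invented raw indicators: X has 3 indicators, Y has 2 indicators
    X_ind = rng.normal(size=(n, 3))
    Y_ind = rng.normal(size=(n, 2)) + 0.5 * X_ind[:, [0]]

    def standardize(a):
        return (a - a.mean(axis=0)) / a.std(axis=0)

    X_ind, Y_ind = standardize(X_ind), standardize(Y_ind)

    # Weight relation: each latent score is a weighted sum of its own indicators
    w_x = np.array([0.4, 0.3, 0.3])   # hypothetical outer weights
    w_y = np.array([0.5, 0.5])
    xi  = X_ind @ w_x                 # latent score of the exogenous variable
    eta = Y_ind @ w_y                 # latent score of the endogenous variable

    # Inner (structural) model: regress eta on xi and report the path coefficient
    path_coefficient = np.polyfit(xi, eta, 1)[0]
    print(round(path_coefficient, 3))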

Model Testing

The model is tested through the outer model and the inner model, that is, the measurement model and the structural model. Reflective indicators are tested through convergent validity, discriminant validity, and average variance extracted (AVE). Formative indicators are tested through their substantive content. Testing of the structural model focuses on the effect of one latent variable on another through the percentage of variance explained, that is, the coefficient of determination of an endogenous latent variable that is influenced by exogenous latent variables. The measures used are the Stone-Geisser Q-square test and the path coefficients.

The outer model is tested through convergent validity, discriminant validity or AVE, and composite reliability. The inner model is tested through the coefficient of determination, the parameter coefficients, and the t-statistics.

This can be summarized in the table below:

Model                       Result                         Criterion                                             Realization   Evaluation
Outer model                 Convergent validity            Loading factor 0.50-0.60
(testing of indicators)     Discriminant validity          Cross loading > correlation of the latent variables
                            AVE                            > 0.50
                            Composite reliability          >= 0.70
Inner model                 Coefficient of determination   0.19 is weak, 0.33 is moderate, 0.67 is strong
(testing of hypotheses)     Parameter coefficient          Estimated value
                            T-statistic                    Significant
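As a worked check of these criteria, the sketch below computes AVE and composite reliability with the usual textbook formulas from standardized loadings, using the Morale loadings that appear later in the SmartPLS report (0.91, 0.866, 0.848, 0.909), and encodes the R-squared rule of thumb from the table. It is an independent calculation, not SmartPLS output.

    # Sketch: AVE and composite reliability from standardized loadings
    # (Morale block loadings taken from the report below).
    loadings = [0.91, 0.866, 0.848, 0.909]

    sum_lam   = sum(loadings)
    sum_lam2  = sum(l**2 for l in loadings)
    sum_error = sum(1 - l**2 for l in loadings)

    ave = sum_lam2 / len(loadings)
    composite_reliability = sum_lam**2 / (sum_lam**2 + sum_error)

    print(round(ave, 3))                    # about 0.78 -> exceeds the 0.50 criterion
    print(round(composite_reliability, 3))  # about 0.93 -> exceeds the 0.70 criterion

    def r2_label(r2):
        # thresholds from the table above: 0.19 weak, 0.33 moderate, 0.67 strong
        if r2 >= 0.67:
            return "strong"
        if r2 >= 0.33:
            return "moderate"
        if r2 >= 0.19:
            return "weak"
        return "below the 0.19 threshold"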

SmartPLS report

Model: C:\Documents and Settings\user\My Documents\employ.splsm
Date: 28.07.2013

Table of contents (whole)

PLS output
Goodness of fit measures

Model data

         

Table of contents

Iterations of the PLS-Algorithm
Inner weights (structural model)
Outer weights (measurement model)
Outer loadings (measurement model)
Scores of the latent variables
Correlations of the latent variables

         

Iterations of the PLS-Algorithm

[ CSV-Version ]

iteration   behave1   behave2   behave3   behave4
0           1         1         1         1
1           0.316     0.238     0.304     0.319
2           0.316     0.238     0.304     0.319

iteration   develop1   develop2   develop3   develop4
0           1          1          1          1
1           0.292      0.306      0.252      0.389
2           0.292      0.305      0.252      0.39

iteration   leader1   leader2   leader3   leader4
0           1         1         1         1
1           0.246     0.257     0.257     0.165
2           0.247     0.258     0.256     0.164

iteration   leader5   morale1   morale2   morale3
0           1         1         1         1
1           0.247     0.291     0.29      0.279
2           0.247     0.291     0.289     0.28

iteration   morale4   person1   person2   person3
0           1         1         1         1
1           0.272     0.289     0.298     0.318
2           0.272     0.288     0.298     0.319

iteration   person4
0           1
1           0.355
2           0.355
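The table shows the outer weights settling down after two passes. The loop below is a hedged sketch of that kind of alternating scheme on invented data: latent scores are computed from the current weights, each block takes the other block's score as its inner proxy, and the weights are refreshed as indicator-proxy correlations until they stop changing. The Mode A (correlation) update and the two-block layout are assumptions; the exact weighting scheme used for this report is not documented here.

    # Hedged sketch of an alternating PLS-type weight update on invented data;
    # not the exact SmartPLS algorithm.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 500
    X = rng.normal(size=(n, 4))                               # invented indicators, block 1
    Y = 0.6 * X[:, [0, 1, 2]] + rng.normal(size=(n, 3)) * 0.8  # invented indicators, block 2

    def std(a):
        return (a - a.mean(0)) / a.std(0)

    X, Y = std(X), std(Y)
    w_x = np.ones(X.shape[1])
    w_y = np.ones(Y.shape[1])

    for iteration in range(100):
        # outer step: latent scores from the current weights (standardized)
        lx = std(X @ w_x)
        ly = std(Y @ w_y)
        # inner step: each block's proxy is the other block's latent score;
        # refresh the weights as indicator-proxy correlations (Mode A-style)
        new_w_x = X.T @ ly / n
        new_w_y = Y.T @ lx / n
        if np.allclose(new_w_x, w_x, atol=1e-6) and np.allclose(new_w_y, w_y, atol=1e-6):
            break
        w_x, w_y = new_w_x, new_w_y

    print(iteration, np.round(w_x, 3), np.round(w_y, 3))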

Table of contents          

Inner weights (structural model)
[ CSV-Version ]

          Leader   Behave   Develop   Person   Morale
Leader                                0.193
Behave                                0.363
Develop                               0.166
Person                                         0.461
Morale

Table of contents          

Outer weights (measurement model)
[ CSV-Version ]

           Leader   Behave   Develop   Person   Morale
behave1             0.316
behave2             0.238
behave3             0.304
behave4             0.319
develop1                     0.292
develop2                     0.305
develop3                     0.252
develop4                     0.39
leader1    0.247
leader2    0.258
leader3    0.256
leader4    0.164
leader5    0.247
morale1                                         0.291
morale2                                         0.289
morale3                                         0.28
morale4                                         0.272
person1                                0.288
person2                                0.298
person3                                0.319
person4                                0.355

Table of contents          

Outer loadings (measurement model)
[ CSV-Version ]

           Leader   Behave   Develop   Person   Morale
behave1             0.849
behave2             0.796
behave3             0.883
behave4             0.858
develop1                     0.804
develop2                     0.785
develop3                     0.699
develop4                     0.898
leader1    0.87
leader2    0.839
leader3    0.878
leader4    0.787
leader5    0.868
morale1                                         0.91
morale2                                         0.866
morale3                                         0.848
morale4                                         0.909
person1                                0.767
person2                                0.766
person3                                0.817
person4                                0.816

Table of contents          

Scores of the latent variables
[ CSV-Version ]

[Per-case scores on Leader, Behave, Develop, Person, and Morale for all 562 cases (0-561); the full per-case listing from the SmartPLS report is not reproduced here.]

Table of contents          

Correlations of the latent variables
[ CSV-Version ]

          Leader   Behave   Develop   Person   Morale
Leader    1
Behave    0.358    1
Develop   0.46     0.443    1
Person    0.4      0.506    0.416     1
Morale    0.618    0.369    0.415     0.461    1

Table of contents          

Bootstrapping

SmartPLS report

Model: C:\Documents and Settings\user\My Documents\employee.splsm
Date: 28.07.2013

Table of contents (whole)

Bootstrapping results         

Table of contents

Settings
results for inner weights
results for outer loadings
results for outer weights
outer weights for each sample
outer loadings for each sample
inner weights for each sample

         

Settings

[ CSV-Version ]

number of cases in original sample   562
preprocessing option                 no changes
cases per sample                     50
number of samples                    100

Table of contents

         

results for inner weights

[ CSV-Version ]

 

Path                 Original sample   Mean of       Standard    T-Statistic
                     estimate          subsamples    deviation
Leader -> Person     0.193             0.218         0.135       1.43
Behave -> Person     0.363             0.349         0.176       2.07
Develop -> Person    0.166             0.221         0.081       2.057
Person -> Morale     0.461             0.498         0.106       4.363
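The T-statistic in these bootstrap tables is the original estimate divided by the standard deviation of the estimates across the resamples. The sketch below repeats that arithmetic for the Person -> Morale path using the figures in the table (0.461 and 0.106), and shows a generic resampling loop on invented data, since the employee data set itself is not reproduced in this report.

    # Sketch: bootstrap T-statistic = original estimate / standard deviation of
    # the estimates over the bootstrap subsamples.
    import numpy as np

    # Arithmetic check against the table above
    print(round(0.461 / 0.106, 2))   # about 4.35; the reported 4.363 reflects unrounded values

    # Generic resampling loop on invented paired scores (not the employee data)
    rng = np.random.default_rng(2)
    person = rng.normal(size=562)
    morale = 0.46 * person + rng.normal(size=562) * 0.9

    def slope(x, y):
        return np.polyfit(x, y, 1)[0]

    original = slope(person, morale)
    boot = []
    for _ in range(100):                       # 100 samples, as in the settings above
        idx = rng.integers(0, 562, size=50)    # 50 cases per sample, as in the settings
        boot.append(slope(person[idx], morale[idx]))

    t_stat = original / np.std(boot, ddof=1)
    print(round(original, 3), round(t_stat, 2))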

Table of contents          

results for outer loadings
[ CSV-Version ]

             Original sample   Mean of       Standard    T-Statistic
             estimate          subsamples    deviation
Leader
  leader1    0.87              0.862         0.048       18.228
  leader2    0.839             0.852         0.07        11.942
  leader3    0.878             0.875         0.038       23.365
  leader4    0.787             0.79          0.064       12.359
  leader5    0.868             0.872         0.034       25.639
Behave
  behave1    0.849             0.842         0.046       18.502
  behave2    0.796             0.742         0.143       5.578
  behave3    0.883             0.876         0.029       30.966
  behave4    0.858             0.859         0.029       29.255
Develop
  develop1   0.804             0.76          0.134       6.018
  develop2   0.785             0.759         0.135       5.801
  develop3   0.699             0.727         0.118       5.918
  develop4   0.898             0.884         0.059       15.311
Person
  person1    0.767             0.788         0.083       9.292
  person2    0.766             0.763         0.095       8.064
  person3    0.817             0.805         0.073       11.24
  person4    0.816             0.812         0.053       15.418
Morale
  morale1    0.91              0.907         0.031       28.941
  morale2    0.866             0.851         0.061       14.229
  morale3    0.848             0.831         0.052       16.196
  morale4    0.909             0.894         0.044       20.668

Table of contents          

results for outer weights
[ CSV-Version ]

             Original sample   Mean of       Standard    T-Statistic
             estimate          subsamples    deviation
Leader
  leader1    0.247             0.224         0.068       3.628
  leader2    0.258             0.252         0.068       3.807
  leader3    0.256             0.262         0.048       5.392
  leader4    0.164             0.17          0.055       2.981
  leader5    0.247             0.251         0.053       4.7
Behave
  behave1    0.316             0.339         0.088       3.601
  behave2    0.238             0.217         0.108       2.201
  behave3    0.304             0.294         0.033       9.236
  behave4    0.319             0.326         0.038       8.481
Develop
  develop1   0.292             0.257         0.149       1.965
  develop2   0.305             0.295         0.114       2.662
  develop3   0.252             0.316         0.117       2.145
  develop4   0.39              0.351         0.06        6.523
Person
  person1    0.288             0.296         0.069       4.153
  person2    0.298             0.289         0.08        3.705
  person3    0.319             0.316         0.041       7.851
  person4    0.355             0.346         0.05        7.11
Morale
  morale1    0.291             0.298         0.035       8.3
  morale2    0.289             0.3           0.04        7.25
  morale3    0.28              0.276         0.047       6.014
  morale4    0.272             0.271         0.044       6.148

Table of contents          

outer weights for each sample
[ CSV-Version ]

[Outer weights of all 21 indicators for each of the 100 bootstrap samples (0-99); the full per-sample listing from the SmartPLS report is not reproduced here.]

Table of contents          

outer loadings for each sample[ CSV-Version ]

  behave1 behave2 behave3 behave40 0.843 0.789 0.878 0.8961 0.843 0.789 0.878 0.8962 0.893 0.829 0.881 0.8223 0.893 0.829 0.881 0.8224 0.893 0.829 0.881 0.8225 0.893 0.829 0.881 0.8226 0.893 0.829 0.881 0.8227 0.89 0.843 0.893 0.8468 0.89 0.843 0.893 0.8469 0.89 0.843 0.893 0.846

10 0.89 0.843 0.893 0.84611 0.89 0.843 0.893 0.84612 0.802 0.799 0.856 0.82513 0.802 0.799 0.856 0.82514 0.802 0.799 0.856 0.825

15 0.802 0.799 0.856 0.82516 0.802 0.799 0.856 0.82517 0.802 0.799 0.856 0.82518 0.856 0.804 0.881 0.91919 0.856 0.804 0.881 0.91920 0.856 0.804 0.881 0.91921 0.856 0.804 0.881 0.91922 0.856 0.804 0.881 0.91923 0.856 0.804 0.881 0.91924 0.788 0.812 0.875 0.81725 0.788 0.812 0.875 0.81726 0.788 0.812 0.875 0.81727 0.788 0.812 0.875 0.81728 0.788 0.812 0.875 0.81729 0.788 0.812 0.875 0.81730 0.819 0.865 0.874 0.87731 0.819 0.865 0.874 0.87732 0.819 0.865 0.874 0.87733 0.819 0.865 0.874 0.87734 0.819 0.865 0.874 0.87735 0.919 0.738 0.922 0.89236 0.919 0.738 0.922 0.89237 0.919 0.738 0.922 0.89238 0.919 0.738 0.922 0.89239 0.919 0.738 0.922 0.89240 0.919 0.738 0.922 0.89241 0.919 0.738 0.922 0.89242 0.785 0.826 0.846 0.83543 0.785 0.826 0.846 0.83544 0.785 0.826 0.846 0.83545 0.785 0.826 0.846 0.83546 0.807 0.707 0.891 0.87647 0.807 0.707 0.891 0.87648 0.807 0.707 0.891 0.87649 0.807 0.707 0.891 0.87650 0.807 0.707 0.891 0.87651 0.807 0.707 0.891 0.87652 0.891 0.815 0.93 0.88353 0.891 0.815 0.93 0.88354 0.891 0.815 0.93 0.88355 0.891 0.815 0.93 0.88356 0.891 0.815 0.93 0.88357 0.891 0.815 0.93 0.88358 0.899 0.755 0.894 0.87159 0.899 0.755 0.894 0.87160 0.899 0.755 0.894 0.87161 0.899 0.755 0.894 0.87162 0.899 0.755 0.894 0.87163 0.899 0.755 0.894 0.87164 0.809 0.43 0.818 0.81265 0.809 0.43 0.818 0.81266 0.809 0.43 0.818 0.81267 0.809 0.43 0.818 0.81268 0.809 0.43 0.818 0.81269 0.833 0.783 0.845 0.86870 0.833 0.783 0.845 0.86871 0.833 0.783 0.845 0.868

72 0.833 0.783 0.845 0.86873 0.833 0.783 0.845 0.86874 0.833 0.783 0.845 0.86875 0.833 0.783 0.845 0.86876 0.82 0.755 0.855 0.82777 0.82 0.755 0.855 0.82778 0.82 0.755 0.855 0.82779 0.82 0.755 0.855 0.82780 0.82 0.755 0.855 0.82781 0.875 0.304 0.837 0.85282 0.875 0.304 0.837 0.85283 0.875 0.304 0.837 0.85284 0.875 0.304 0.837 0.85285 0.875 0.304 0.837 0.85286 0.875 0.304 0.837 0.85287 0.848 0.716 0.893 0.85888 0.848 0.716 0.893 0.85889 0.848 0.716 0.893 0.85890 0.848 0.716 0.893 0.85891 0.848 0.716 0.893 0.85892 0.848 0.716 0.893 0.85893 0.848 0.716 0.893 0.85894 0.762 0.856 0.863 0.87995 0.762 0.856 0.863 0.87996 0.762 0.856 0.863 0.87997 0.762 0.856 0.863 0.87998 0.762 0.856 0.863 0.87999 0.736 0.81 0.891 0.878

  develop1 develop2 develop3 develop40 0.833 0.705 0.689 0.8541 0.833 0.705 0.689 0.8542 0.771 0.877 0.84 0.8233 0.771 0.877 0.84 0.8234 0.771 0.877 0.84 0.8235 0.771 0.877 0.84 0.8236 0.771 0.877 0.84 0.8237 0.844 0.74 0.795 0.898 0.844 0.74 0.795 0.899 0.844 0.74 0.795 0.89

10 0.844 0.74 0.795 0.8911 0.844 0.74 0.795 0.8912 0.783 0.299 0.755 0.90413 0.783 0.299 0.755 0.90414 0.783 0.299 0.755 0.90415 0.783 0.299 0.755 0.90416 0.783 0.299 0.755 0.90417 0.783 0.299 0.755 0.90418 0.898 0.82 0.752 0.89619 0.898 0.82 0.752 0.89620 0.898 0.82 0.752 0.89621 0.898 0.82 0.752 0.89622 0.898 0.82 0.752 0.89623 0.898 0.82 0.752 0.89624 0.922 0.724 0.387 0.91225 0.922 0.724 0.387 0.91226 0.922 0.724 0.387 0.912

(Per-sample bootstrap values, samples 0–99, continued: the remaining column blocks cover leader1–leader4; leader5 and morale1–morale3; morale4 and person1–person3; and person4. Each row lists the values obtained in one bootstrap sample.)


Inner weights for each sample [CSV version]

Columns: Leader -> Person, Behave -> Person, Develop -> Person, Person -> Morale

(Samples 0–99 are listed row by row; for example, sample 0 has inner weights 0.084, 0.683, 0.163, 0.517, and sample 99 has 0.432, 0.247, 0.144, 0.514.)
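These per-sample inner weights are the raw material that the SmartPLS bootstrapping report summarizes into standard errors and t-values. As a sketch of the standard resampling arithmetic (general bootstrap formulas, not output reproduced from SmartPLS), with B resamples, per-sample estimates w_1, ..., w_B, and w the estimate from the original sample:

\bar{w} = \frac{1}{B}\sum_{b=1}^{B} w_b, \qquad \widehat{SE}_{\text{boot}} = \sqrt{\frac{1}{B-1}\sum_{b=1}^{B}\left(w_b - \bar{w}\right)^2}, \qquad t = \frac{w}{\widehat{SE}_{\text{boot}}}

In the listing above the samples run from 0 to 99 (B = 100), so it is the resulting means, standard errors, and t-values that are normally interpreted rather than the individual rows.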


Lisrel 8.30

Lisrel 8.30 is used here because Lisrel 9.10 Student Edition cannot process a model with as many as 21 indicators. This limitation does not stand in the way of the comparison between Lisrel and SmartPLS. The Lisrel 8.30 output for the Employee.sav data is as follows:


DATE: 7/28/2013 TIME: 5:41

L I S R E L 8.30

BY

Karl G. Jöreskog & Dag Sörbom

This program is published exclusively by
Scientific Software International, Inc.
7383 N. Lincoln Avenue, Suite 100
Chicago, IL 60646-1704, U.S.A.
Phone: (800)247-6113, (847)675-0720, Fax: (847)675-2140
Copyright by Scientific Software International, Inc., 1981-99
Use of this program is subject to the terms specified in the Universal Copyright Convention.
Website: www.ssicentral.com

The following lines were read from file C:\LISREL83\LS8EX\EMPLOYEE.SPJ:

TI FRONTLINE EMPLOYEE FULL MODEL
Observed Variables
 MORALE1 MORALE2 MORALE3 MORALE4 PERSON1 PERSON2 PERSON3 PERSON4
 LEADER1 LEADER2 LEADER3 LEADER4 LEADER5 BEHAVE1 BEHAVE2 BEHAVE3 BEHAVE4
 DEVELOP1 DEVELOP2 DEVELOP3 DEVELOP4
Covariance Matrix
 1.31
 0.82 0.96
 0.90 0.69 1.26
 1.03 0.78 0.86 1.20
 0.31 0.27 0.30 0.27 0.76
 0.34 0.29 0.33 0.32 0.43 0.92
 0.36 0.31 0.32 0.34 0.41 0.47 0.75
 0.43 0.35 0.44 0.40 0.43 0.51 0.48 1.00
 0.68 0.49 0.56 0.64 0.28 0.34 0.29 0.36 1.22
 0.84 0.61 0.66 0.75 0.30 0.33 0.32 0.39 0.88 1.20
 0.59 0.43 0.52 0.54 0.27 0.35 0.28 0.35 0.90 0.80 1.22
 0.51 0.32 0.46 0.44 0.22 0.18 0.18 0.22 0.74 0.68 0.73 1.00
 0.61 0.42 0.50 0.54 0.24 0.33 0.27 0.33 0.83 0.77 0.80 0.64 1.06
 0.32 0.28 0.39 0.30 0.28 0.35 0.32 0.39 0.29 0.31 0.30 0.24 0.26 0.98
 0.28 0.21 0.26 0.26 0.22 0.29 0.22 0.28 0.30 0.28 0.29 0.25 0.27 0.59 0.98
 0.31 0.26 0.36 0.30 0.32 0.35 0.33 0.38 0.32 0.33 0.30 0.25 0.28 0.72 0.66 1.11
 0.33 0.31 0.41 0.33 0.34 0.38 0.36 0.44 0.39 0.36 0.37 0.31 0.30 0.70 0.63 0.79 1.19
 0.32 0.32 0.36 0.36 0.20 0.27 0.18 0.31 0.41 0.36 0.39 0.29 0.34 0.36 0.28 0.32 0.41 1.12
 0.30 0.32 0.40 0.33 0.25 0.30 0.23 0.30 0.45 0.40 0.40 0.35 0.36 0.35 0.27 0.34 0.49 0.61 1.24
 0.24 0.24 0.24 0.23 0.17 0.23 0.16 0.25 0.34 0.32 0.29 0.22 0.27 0.25 0.21 0.21 0.29 0.45 0.45 1.04
 0.42 0.36 0.44 0.43 0.26 0.37 0.24 0.41 0.56 0.46 0.48 0.34 0.42 0.40 0.29 0.36 0.41 0.73 0.70 0.55 1.08
Means
 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Sample Size = 538
Latent Variables MORALE PERSON LEADER BEHAVE DEVELOP
Relationships
 MORALE1 = MORALE
 MORALE2 = MORALE
 MORALE3 = MORALE
 MORALE4 = MORALE
 PERSON1 = PERSON
 PERSON2 = PERSON
 PERSON3 = PERSON
 PERSON4 = PERSON
 LEADER1 = LEADER
 LEADER2 = LEADER
 LEADER3 = LEADER
 LEADER4 = LEADER
 LEADER5 = LEADER
 BEHAVE1 = BEHAVE
 BEHAVE2 = BEHAVE
 BEHAVE3 = BEHAVE
 BEHAVE4 = BEHAVE
 DEVELOP1 = DEVELOP
 DEVELOP2 = DEVELOP
 DEVELOP3 = DEVELOP
 DEVELOP4 = DEVELOP
 MORALE = PERSON
 PERSON = LEADER BEHAVE DEVELOP
Set the Variance of LEADER to 1.00
Set the Variance of BEHAVE to 1.00
Set the Variance of DEVELOP to 1.00
Set the Error Variance of MORALE to 0.64
Set the Error Variance of PERSON to 0.50
Path Diagram
Print Residuals
Method of Estimation: Maximum Likelihood
End of Problem
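Written out as equations, the structural part of this SIMPLIS model is as follows (the coefficient symbols are generic labels used here for readability, not names taken from the LISREL printout):

\mathrm{PERSON} = \gamma_1\,\mathrm{LEADER} + \gamma_2\,\mathrm{BEHAVE} + \gamma_3\,\mathrm{DEVELOP} + \zeta_1, \qquad \mathrm{Var}(\zeta_1) = 0.50

\mathrm{MORALE} = \beta\,\mathrm{PERSON} + \zeta_2, \qquad \mathrm{Var}(\zeta_2) = 0.64

with the variances of the exogenous latent variables LEADER, BEHAVE, and DEVELOP fixed at 1.00 by the Set commands.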

Sample Size = 538

TI FRONTLINE EMPLOYEE FULL MODEL

Covariance Matrix to be Analyzed

            MORALE1  MORALE2  MORALE3  MORALE4  PERSON1  PERSON2
 MORALE1       1.31
 MORALE2       0.82     0.96
 MORALE3       0.90     0.69     1.26
 MORALE4       1.03     0.78     0.86     1.20
 PERSON1       0.31     0.27     0.30     0.27     0.76
 PERSON2       0.34     0.29     0.33     0.32     0.43     0.92
 PERSON3       0.36     0.31     0.32     0.34     0.41     0.47
 PERSON4       0.43     0.35     0.44     0.40     0.43     0.51
 LEADER1       0.68     0.49     0.56     0.64     0.28     0.34
 LEADER2       0.84     0.61     0.66     0.75     0.30     0.33
 LEADER3       0.59     0.43     0.52     0.54     0.27     0.35
 LEADER4       0.51     0.32     0.46     0.44     0.22     0.18
 LEADER5       0.61     0.42     0.50     0.54     0.24     0.33
 BEHAVE1       0.32     0.28     0.39     0.30     0.28     0.35
 BEHAVE2       0.28     0.21     0.26     0.26     0.22     0.29
 BEHAVE3       0.31     0.26     0.36     0.30     0.32     0.35
 BEHAVE4       0.33     0.31     0.41     0.33     0.34     0.38
 DEVELOP1      0.32     0.32     0.36     0.36     0.20     0.27
 DEVELOP2      0.30     0.32     0.40     0.33     0.25     0.30
 DEVELOP3      0.24     0.24     0.24     0.23     0.17     0.23
 DEVELOP4      0.42     0.36     0.44     0.43     0.26     0.37

Covariance Matrix to be Analyzed

            PERSON3  PERSON4  LEADER1  LEADER2  LEADER3  LEADER4
 PERSON3       0.75
 PERSON4       0.48     1.00
 LEADER1       0.29     0.36     1.22
 LEADER2       0.32     0.39     0.88     1.20
 LEADER3       0.28     0.35     0.90     0.80     1.22
 LEADER4       0.18     0.22     0.74     0.68     0.73     1.00
 LEADER5       0.27     0.33     0.83     0.77     0.80     0.64
 BEHAVE1       0.32     0.39     0.29     0.31     0.30     0.24
 BEHAVE2       0.22     0.28     0.30     0.28     0.29     0.25
 BEHAVE3       0.33     0.38     0.32     0.33     0.30     0.25
 BEHAVE4       0.36     0.44     0.39     0.36     0.37     0.31
 DEVELOP1      0.18     0.31     0.41     0.36     0.39     0.29
 DEVELOP2      0.23     0.30     0.45     0.40     0.40     0.35
 DEVELOP3      0.16     0.25     0.34     0.32     0.29     0.22
 DEVELOP4      0.24     0.41     0.56     0.46     0.48     0.34

Covariance Matrix to be Analyzed

            LEADER5  BEHAVE1  BEHAVE2  BEHAVE3  BEHAVE4  DEVELOP1
 LEADER5       1.06
 BEHAVE1       0.26     0.98
 BEHAVE2       0.27     0.59     0.98
 BEHAVE3       0.28     0.72     0.66     1.11
 BEHAVE4       0.30     0.70     0.63     0.79     1.19
 DEVELOP1      0.34     0.36     0.28     0.32     0.41     1.12
 DEVELOP2      0.36     0.35     0.27     0.34     0.49     0.61
 DEVELOP3      0.27     0.25     0.21     0.21     0.29     0.45
 DEVELOP4      0.42     0.40     0.29     0.36     0.41     0.73

Covariance Matrix to be Analyzed

            DEVELOP2  DEVELOP3  DEVELOP4
 DEVELOP2      1.24
 DEVELOP3      0.45      1.04
 DEVELOP4      0.70      0.55      1.08

TI FRONTLINE EMPLOYEE FULL MODEL

Number of Iterations = 11

LISREL Estimates (Maximum Likelihood)

 MORALE1 = 1.04*MORALE, Errorvar.= 0.23 , R² = 0.82
           (0.043)                (0.023)
            23.97                  9.98
 MORALE2 = 0.80*MORALE, Errorvar.= 0.33 , R² = 0.65
           (0.039)                (0.024)
            20.66                  13.86
 MORALE3 = 0.88*MORALE, Errorvar.= 0.50 , R² = 0.61
           (0.045)                (0.035)
            19.67                  14.35
 MORALE4 = 0.99*MORALE, Errorvar.= 0.23 , R² = 0.81
           (0.042)                (0.022)
            23.79                  10.36
 PERSON1 = 0.59*PERSON, Errorvar.= 0.41 , R² = 0.46
           (0.039)                (0.029)
            15.26                  13.88
 PERSON2 = 0.69*PERSON, Errorvar.= 0.44 , R² = 0.52
           (0.043)                (0.033)
            16.24                  13.21
 PERSON3 = 0.64*PERSON, Errorvar.= 0.33 , R² = 0.55
           (0.038)                (0.026)
            16.77                  12.75
 PERSON4 = 0.73*PERSON, Errorvar.= 0.46 , R² = 0.54
           (0.044)                (0.036)
            16.51                  12.98
 LEADER1 = 0.98*LEADER, Errorvar.= 0.26 , R² = 0.78
           (0.038)                (0.023)
            25.62                  11.34
 LEADER2 = 0.90*LEADER, Errorvar.= 0.40 , R² = 0.67
           (0.040)                (0.029)
            22.59                  13.59
 LEADER3 = 0.92*LEADER, Errorvar.= 0.37 , R² = 0.70
           (0.039)                (0.028)
            23.42                  13.13
 LEADER4 = 0.76*LEADER, Errorvar.= 0.43 , R² = 0.57
           (0.038)                (0.029)
            20.19                  14.53
 LEADER5 = 0.85*LEADER, Errorvar.= 0.33 , R² = 0.69
           (0.037)                (0.025)
            23.09                  13.32
 BEHAVE1 = 0.81*BEHAVE, Errorvar.= 0.33 , R² = 0.67
           (0.037)                (0.027)
            22.09                  12.17
 BEHAVE2 = 0.72*BEHAVE, Errorvar.= 0.46 , R² = 0.53
           (0.038)                (0.033)
            18.83                  14.02
 BEHAVE3 = 0.89*BEHAVE, Errorvar.= 0.32 , R² = 0.71
           (0.038)                (0.029)
            23.25                  11.11
 BEHAVE4 = 0.88*BEHAVE, Errorvar.= 0.41 , R² = 0.66
           (0.040)                (0.033)
            21.84                  12.37
 DEVELOP1 = 0.78*DEVELOP, Errorvar.= 0.50 , R² = 0.55
            (0.042)                 (0.038)
             18.89                   13.15
 DEVELOP2 = 0.77*DEVELOP, Errorvar.= 0.65 , R² = 0.48
            (0.045)                 (0.046)
             17.20                   14.06
 DEVELOP3 = 0.59*DEVELOP, Errorvar.= 0.69 , R² = 0.33
            (0.043)                 (0.046)
             13.77                   15.16
 DEVELOP4 = 0.92*DEVELOP, Errorvar.= 0.23 , R² = 0.79
            (0.038)                 (0.031)
             24.26                   7.24

 MORALE = 0.59*PERSON, Errorvar.= 0.64, R² = 0.35
          (0.055)
           10.83
 PERSON = 0.31*LEADER + 0.40*BEHAVE + 0.16*DEVELOP, Errorvar.= 0.50, R² = 0.50
          (0.052)       (0.054)       (0.056)
           5.99          7.44          2.82

Correlation Matrix of Independent Variables

            LEADER   BEHAVE   DEVELOP
 LEADER       1.00
 BEHAVE       0.41     1.00
             (0.04)
              9.92
 DEVELOP      0.56     0.50     1.00
             (0.04)   (0.04)
             15.83    12.94

Covariance Matrix of Latent Variables

            MORALE   PERSON   LEADER   BEHAVE   DEVELOP
 MORALE       0.99
 PERSON       0.59     1.00
 LEADER       0.33     0.56     1.00
 BEHAVE       0.36     0.60     0.41     1.00
 DEVELOP      0.31     0.53     0.56     0.50     1.00
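A quick arithmetic check on these estimates (hand calculation from the printed values, not additional LISREL output): because the error variances of the structural equations were fixed at 0.50 and 0.64, the R² values reported above follow directly from the latent variances in this matrix,

R^2_{\mathrm{PERSON}} = 1 - \frac{0.50}{\mathrm{Var}(\mathrm{PERSON})} = 1 - \frac{0.50}{1.00} = 0.50, \qquad R^2_{\mathrm{MORALE}} = 1 - \frac{0.64}{\mathrm{Var}(\mathrm{MORALE})} = 1 - \frac{0.64}{0.99} \approx 0.35

which agrees with the R² values printed for the PERSON and MORALE equations.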

Goodness of Fit Statistics

Degrees of Freedom = 182
Minimum Fit Function Chi-Square = 476.59 (P = 0.0)
Normal Theory Weighted Least Squares Chi-Square = 443.60 (P = 0.0)
Estimated Non-centrality Parameter (NCP) = 261.60
90 Percent Confidence Interval for NCP = (203.63 ; 327.26)
Minimum Fit Function Value = 0.89
Population Discrepancy Function Value (F0) = 0.49
90 Percent Confidence Interval for F0 = (0.38 ; 0.61)
Root Mean Square Error of Approximation (RMSEA) = 0.052
90 Percent Confidence Interval for RMSEA = (0.046 ; 0.058)
P-Value for Test of Close Fit (RMSEA < 0.05) = 0.31
Expected Cross-Validation Index (ECVI) = 1.01
90 Percent Confidence Interval for ECVI = (0.90 ; 1.13)
ECVI for Saturated Model = 0.86
ECVI for Independence Model = 13.13
Chi-Square for Independence Model with 210 Degrees of Freedom = 7010.32
Independence AIC = 7052.32
Model AIC = 541.60
Saturated AIC = 462.00
Independence CAIC = 7163.36
Model CAIC = 800.70
Saturated CAIC = 1683.50
Root Mean Square Residual (RMR) = 0.099
Standardized RMR = 0.085
Goodness of Fit Index (GFI) = 0.93
Adjusted Goodness of Fit Index (AGFI) = 0.91
Parsimony Goodness of Fit Index (PGFI) = 0.73
Normed Fit Index (NFI) = 0.93
Non-Normed Fit Index (NNFI) = 0.95
Parsimony Normed Fit Index (PNFI) = 0.81
Comparative Fit Index (CFI) = 0.96
Incremental Fit Index (IFI) = 0.96
Relative Fit Index (RFI) = 0.92
Critical N (CN) = 259.37
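Two of these indices can be reproduced by hand from the chi-square, the degrees of freedom, and the sample size, which is a useful check when reading the listing (standard formulas, shown here only for illustration):

\mathrm{NCP} = \chi^2_{\mathrm{NTWLS}} - df = 443.60 - 182 = 261.60

\mathrm{RMSEA} = \sqrt{\frac{\mathrm{NCP}}{df\,(N-1)}} = \sqrt{\frac{261.60}{182 \times 537}} \approx 0.052

with N = 538, matching the NCP and RMSEA values reported above.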

TI FRONTLINE EMPLOYEE FULL MODEL

Fitted Covariance Matrix

            MORALE1  MORALE2  MORALE3  MORALE4  PERSON1  PERSON2
 MORALE1       1.31
 MORALE2       0.82     0.96
 MORALE3       0.91     0.69     1.26
 MORALE4       1.02     0.78     0.86     1.20
 PERSON1       0.37     0.28     0.31     0.35     0.76
 PERSON2       0.43     0.33     0.36     0.41     0.41     0.92
 PERSON3       0.40     0.30     0.34     0.38     0.38     0.45
 PERSON4       0.45     0.35     0.38     0.43     0.43     0.51
 LEADER1       0.34     0.26     0.29     0.32     0.33     0.38
 LEADER2       0.31     0.24     0.26     0.30     0.30     0.35
 LEADER3       0.32     0.25     0.27     0.31     0.31     0.36
 LEADER4       0.26     0.20     0.22     0.25     0.25     0.30
 LEADER5       0.30     0.23     0.25     0.28     0.29     0.33
 BEHAVE1       0.30     0.23     0.25     0.29     0.29     0.34
 BEHAVE2       0.27     0.21     0.23     0.26     0.26     0.30
 BEHAVE3       0.33     0.25     0.28     0.32     0.32     0.37
 BEHAVE4       0.33     0.25     0.28     0.31     0.32     0.37
 DEVELOP1      0.26     0.20     0.22     0.25     0.25     0.29
 DEVELOP2      0.25     0.19     0.21     0.24     0.24     0.28
 DEVELOP3      0.19     0.15     0.16     0.18     0.19     0.22
 DEVELOP4      0.30     0.23     0.26     0.29     0.29     0.34

Fitted Covariance Matrix

            PERSON3  PERSON4  LEADER1  LEADER2  LEADER3  LEADER4
 PERSON3       0.75
 PERSON4       0.47     1.00
 LEADER1       0.35     0.40     1.22
 LEADER2       0.32     0.37     0.88     1.20
 LEADER3       0.34     0.38     0.90     0.83     1.22
 LEADER4       0.27     0.31     0.74     0.68     0.70     1.00
 LEADER5       0.31     0.35     0.83     0.76     0.79     0.65
 BEHAVE1       0.31     0.36     0.32     0.29     0.30     0.25
 BEHAVE2       0.28     0.32     0.29     0.26     0.27     0.22
 BEHAVE3       0.35     0.39     0.35     0.32     0.33     0.27
 BEHAVE4       0.34     0.39     0.35     0.32     0.33     0.27
 DEVELOP1      0.27     0.31     0.43     0.39     0.40     0.33
 DEVELOP2      0.26     0.30     0.42     0.38     0.40     0.32
 DEVELOP3      0.20     0.23     0.32     0.29     0.30     0.25
 DEVELOP4      0.32     0.36     0.50     0.46     0.48     0.39

Fitted Covariance Matrix

            LEADER5  BEHAVE1  BEHAVE2  BEHAVE3  BEHAVE4  DEVELOP1
 LEADER5       1.06
 BEHAVE1       0.28     0.98
 BEHAVE2       0.25     0.58     0.98
 BEHAVE3       0.31     0.72     0.64     1.11
 BEHAVE4       0.31     0.71     0.64     0.79     1.19
 DEVELOP1      0.37     0.32     0.28     0.35     0.35     1.12
 DEVELOP2      0.37     0.31     0.28     0.34     0.34     0.60
 DEVELOP3      0.28     0.24     0.21     0.26     0.26     0.46
 DEVELOP4      0.44     0.37     0.33     0.41     0.41     0.72

Fitted Covariance Matrix

            DEVELOP2  DEVELOP3  DEVELOP4
 DEVELOP2      1.24
 DEVELOP3      0.45      1.04
 DEVELOP4      0.71      0.54      1.08

Fitted Residuals

            MORALE1  MORALE2  MORALE3  MORALE4  PERSON1  PERSON2
 MORALE1       0.00
 MORALE2       0.00     0.00
 MORALE3      -0.01     0.00     0.00
 MORALE4       0.01     0.00     0.00     0.00
 PERSON1      -0.06    -0.01    -0.01    -0.08     0.00
 PERSON2      -0.09    -0.04    -0.03    -0.09     0.02     0.00
 PERSON3      -0.04     0.01    -0.02    -0.04     0.03     0.02
 PERSON4      -0.02     0.00     0.06    -0.03     0.00     0.00
 LEADER1       0.34     0.23     0.27     0.32    -0.05    -0.04
 LEADER2       0.53     0.37     0.40     0.45     0.00    -0.02
 LEADER3       0.27     0.18     0.25     0.23    -0.04    -0.01
 LEADER4       0.25     0.12     0.24     0.19    -0.03    -0.12
 LEADER5       0.31     0.19     0.25     0.26    -0.05     0.00
 BEHAVE1       0.02     0.05     0.14     0.01    -0.01     0.01
 BEHAVE2       0.01     0.00     0.03     0.00    -0.04    -0.01
 BEHAVE3      -0.02     0.01     0.08    -0.02     0.00    -0.02
 BEHAVE4       0.00     0.06     0.13     0.02     0.02     0.01
 DEVELOP1      0.06     0.12     0.14     0.11    -0.05    -0.02
 DEVELOP2      0.05     0.13     0.19     0.09     0.01     0.02
 DEVELOP3      0.05     0.09     0.08     0.05    -0.02     0.01
 DEVELOP4      0.12     0.13     0.18     0.14    -0.03     0.03

Fitted Residuals

            PERSON3  PERSON4  LEADER1  LEADER2  LEADER3  LEADER4
 PERSON3       0.00
 PERSON4       0.01     0.00
 LEADER1      -0.06    -0.04     0.00
 LEADER2       0.00     0.02     0.00     0.00
 LEADER3      -0.06    -0.03     0.00    -0.03     0.00
 LEADER4      -0.09    -0.09     0.00     0.00     0.03     0.00
 LEADER5      -0.04    -0.02     0.00     0.01     0.01    -0.01
 BEHAVE1       0.01     0.03    -0.03     0.02     0.00    -0.01
 BEHAVE2      -0.06    -0.04     0.01     0.02     0.02     0.03
 BEHAVE3      -0.02    -0.01    -0.03     0.01    -0.03    -0.02
 BEHAVE4       0.02     0.05     0.04     0.04     0.04     0.04
 DEVELOP1     -0.09     0.00    -0.02    -0.03    -0.01    -0.04
 DEVELOP2     -0.03     0.00     0.03     0.02     0.00     0.03
 DEVELOP3     -0.04     0.02     0.02     0.03    -0.01    -0.03
 DEVELOP4     -0.08     0.05     0.06     0.00     0.00    -0.05

Fitted Residuals

            LEADER5  BEHAVE1  BEHAVE2  BEHAVE3  BEHAVE4  DEVELOP1
 LEADER5       0.00
 BEHAVE1      -0.02     0.00
 BEHAVE2       0.02     0.01     0.00
 BEHAVE3      -0.03     0.00     0.02     0.00
 BEHAVE4      -0.01    -0.01    -0.01     0.00     0.00
 DEVELOP1     -0.03     0.04     0.00    -0.03     0.06     0.00
 DEVELOP2     -0.01     0.04    -0.01     0.00     0.15     0.01
 DEVELOP3     -0.01     0.01     0.00    -0.05     0.03    -0.01
 DEVELOP4     -0.02     0.03    -0.04    -0.05     0.00     0.01

Fitted Residuals

            DEVELOP2  DEVELOP3  DEVELOP4
 DEVELOP2      0.00
 DEVELOP3      0.00      0.00
 DEVELOP4     -0.01      0.01      0.00

Summary Statistics for Fitted Residuals

Smallest Fitted Residual = -0.12
Median Fitted Residual   =  0.00
Largest Fitted Residual  =  0.53

Stemleaf Plot

- 1|2
- 0|99999886666555555
- 0|444444444444333333333333333222222222222221111111111111111111000000000000+41
  0|11111111111111111111222222222222222233333333333444444
  0|555555666668899
  1|1222333444
  1|588999
  2|334
  2|555677
  3|124
  3|7
  4|0
  4|5
  5|3

Standardized Residuals

            MORALE1  MORALE2  MORALE3  MORALE4  PERSON1  PERSON2
 MORALE1      - -
 MORALE2      -0.30    - -
 MORALE3      -0.72    -0.16    - -
 MORALE4       1.75    -0.28    -0.25    - -
 PERSON1      -2.29    -0.40    -0.29    -3.31    - -
 PERSON2      -3.47    -1.44    -0.99    -3.55     1.39    - -
 PERSON3      -1.74     0.28    -0.57    -1.83     2.34     2.04
 PERSON4      -0.88     0.17     1.90    -1.23    -0.34     0.18
 LEADER1       8.73     6.42     6.53     8.45    -1.86    -1.59
 LEADER2      12.97    10.05     9.23    11.59     0.04    -0.67
 LEADER3       6.63     5.00     5.80     6.02    -1.45    -0.37
 LEADER4       6.37     3.41     5.89     5.10    -1.25    -4.14
 LEADER5       8.25     5.59     6.21     7.06    -1.80    -0.10
 BEHAVE1       0.52     1.52     3.57     0.38    -0.42     0.49
 BEHAVE2       0.27     0.12     0.83     0.10    -1.52    -0.45
 BEHAVE3      -0.60     0.19     2.01    -0.45     0.04    -0.91
 BEHAVE4       0.01     1.61     3.14     0.43     0.90     0.38
 DEVELOP1      1.51     3.31     3.31     2.89    -1.70    -0.64
 DEVELOP2      1.06     3.18     4.04     2.09     0.25     0.52
 DEVELOP3      1.08     2.42     1.75     1.11    -0.51     0.40
 DEVELOP4      3.18     3.79     4.67     4.01    -1.35     1.28

Standardized Residuals

            PERSON3  PERSON4  LEADER1  LEADER2  LEADER3  LEADER4
 PERSON3      - -
 PERSON4       0.63    - -
 LEADER1      -2.90    -1.65    - -
 LEADER2      -0.20     0.70     0.48    - -
 LEADER3      -2.28    -1.11    -0.45    -2.29    - -
 LEADER4      -3.87    -3.23    -0.07     0.10     2.18    - -
 LEADER5      -1.73    -0.83    -0.59     0.45     0.98    -0.53
 BEHAVE1       0.24     1.28    -1.17     0.57    -0.10    -0.28
 BEHAVE2      -2.57    -1.44     0.45     0.54     0.60     0.89
 BEHAVE3      -0.78    -0.57    -1.28     0.23    -1.17    -0.79
 BEHAVE4       0.67     1.75     1.38     1.22     1.23     1.21
 DEVELOP1     -3.42     0.13    -0.64    -1.04    -0.49    -1.37
 DEVELOP2     -1.16     0.01     0.96     0.46     0.10     0.74
 DEVELOP3     -1.43     0.60     0.56     0.73    -0.39    -0.86
 DEVELOP4     -3.78     2.08     2.96    -0.07     0.16    -2.04

Standardized Residuals

            LEADER5  BEHAVE1  BEHAVE2  BEHAVE3  BEHAVE4  DEVELOP1
 LEADER5      - -
 BEHAVE1      -0.74    - -
 BEHAVE2       0.66     0.46    - -
 BEHAVE3      -1.04    -0.01     1.49    - -
 BEHAVE4      -0.19    -1.56    -0.67     0.33    - -
 DEVELOP1     -1.18     1.47    -0.15    -1.06     1.96    - -
 DEVELOP2     -0.19     1.19    -0.25    -0.11     4.20     0.37
 DEVELOP3     -0.32     0.34    -0.10    -1.57     0.80    -0.58
 DEVELOP4     -0.93     1.20    -1.75    -2.57     0.02     0.94

Standardized Residuals

            DEVELOP2  DEVELOP3  DEVELOP4
 DEVELOP2     - -
 DEVELOP3     -0.11     - -
 DEVELOP4     -1.34      0.64     - -

Summary Statistics for Standardized Residuals

Smallest Standardized Residual = -4.14
Median Standardized Residual   =  0.00
Largest Standardized Residual  = 12.97

Stemleaf Plot

- 4|1
- 3|9855432
- 2|9663330
- 1|9887777766654444443332222211000
- 0|999988877776666666555554444433333332222211111111000000000000000000000000000
  0|1111122222223333444444555555556666667777788999
  1|00111222223344555567789
  2|000112349
  3|012233468
  4|0027
  5|01689
  6|024456
  7|1
  8|257
  9|2
 10|1
 11|6
 12|
 13|0

Largest Negative Standardized Residuals
 Residual for PERSON1 and MORALE4    -3.31
 Residual for PERSON2 and MORALE1    -3.47
 Residual for PERSON2 and MORALE4    -3.55
 Residual for LEADER1 and PERSON3    -2.90
 Residual for LEADER4 and PERSON2    -4.14
 Residual for LEADER4 and PERSON3    -3.87
 Residual for LEADER4 and PERSON4    -3.23
 Residual for DEVELOP1 and PERSON3   -3.42
 Residual for DEVELOP4 and PERSON3   -3.78

Largest Positive Standardized Residuals
 Residual for LEADER1 and MORALE1     8.73
 Residual for LEADER1 and MORALE2     6.42
 Residual for LEADER1 and MORALE3     6.53
 Residual for LEADER1 and MORALE4     8.45
 Residual for LEADER2 and MORALE1    12.97
 Residual for LEADER2 and MORALE2    10.05
 Residual for LEADER2 and MORALE3     9.23
 Residual for LEADER2 and MORALE4    11.59
 Residual for LEADER3 and MORALE1     6.63
 Residual for LEADER3 and MORALE2     5.00
 Residual for LEADER3 and MORALE3     5.80
 Residual for LEADER3 and MORALE4     6.02
 Residual for LEADER4 and MORALE1     6.37
 Residual for LEADER4 and MORALE2     3.41
 Residual for LEADER4 and MORALE3     5.89
 Residual for LEADER4 and MORALE4     5.10
 Residual for LEADER5 and MORALE1     8.25
 Residual for LEADER5 and MORALE2     5.59
 Residual for LEADER5 and MORALE3     6.21
 Residual for LEADER5 and MORALE4     7.06
 Residual for BEHAVE1 and MORALE3     3.57
 Residual for BEHAVE4 and MORALE3     3.14
 Residual for DEVELOP1 and MORALE2    3.31
 Residual for DEVELOP1 and MORALE3    3.31
 Residual for DEVELOP1 and MORALE4    2.89
 Residual for DEVELOP2 and MORALE2    3.18
 Residual for DEVELOP2 and MORALE3    4.04
 Residual for DEVELOP2 and BEHAVE4    4.20
 Residual for DEVELOP4 and MORALE1    3.18
 Residual for DEVELOP4 and MORALE2    3.79
 Residual for DEVELOP4 and MORALE3    4.67
 Residual for DEVELOP4 and MORALE4    4.01
 Residual for DEVELOP4 and LEADER1    2.96
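For reading this part of the output: a standardized residual compares an observed covariance with the covariance implied by the model, relative to the estimated standard error of that difference (the usual LISREL definition),

z_{ij} = \frac{s_{ij} - \hat{\sigma}_{ij}}{\widehat{SE}\left(s_{ij} - \hat{\sigma}_{ij}\right)}

so values far outside roughly ±3, such as the LEADER–MORALE residuals listed above, flag covariances that the model reproduces poorly.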

The Modification Indices Suggest to Add the Path
  to        from       Decrease in Chi-Square   New Estimate
  PERSON2   MORALE             13.1                 -0.17
  PERSON    MORALE             76.8                 -0.63
  MORALE    LEADER            141.5                  0.62
  MORALE    DEVELOP            28.9                  0.28

The Modification Indices Suggest to Add an Error Covariance
  Between    and        Decrease in Chi-Square   New Estimate
  PERSON     MORALE             76.8                 -0.40
  LEADER2    MORALE1            13.5                  0.06
  LEADER4    PERSON2             9.8                 -0.07
  DEVELOP2   BEHAVE4            18.2                  0.11
  DEVELOP4   LEADER1            11.9                  0.06
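If one wished to act on the largest modification index, the direct path from LEADER to MORALE (expected chi-square decrease 141.5), only the structural lines of the SIMPLIS syntax would need to change. A minimal sketch, shown for illustration only and not part of the original analysis:

Relationships
 MORALE = PERSON LEADER
 PERSON = LEADER BEHAVE DEVELOP

The measurement relationships and the Set commands remain exactly as in the syntax listed earlier; whether such an added path is theoretically defensible is a separate question from the statistical suggestion.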

The Problem used 62832 Bytes (= 0.1% of Available Workspace)

Time used: 0.141 Seconds