Term
Industrial/Organizational Psychology |
|
Definition
applied field that is concerned with the development and application of scientific principles to the workplace. *Does not deal with employees' emotional/personal problems. |
|
|
Term
I/O Psychologist's Settings |
|
Definition
*39% Universities *25% Consulting Firms *17% Private Companies *10% Other *9% Government |
|
|
Term
Industrial (I) Side |
|
Definition
managing organizational efficiency. Older branch. *efficient job design, employee selection, employee training, performance appraisal. |
|
|
Term
Organizational (O) Side |
|
Definition
Understanding behavior and enhancing the well-being of employees in the workplace. *Employee attitudes, employee behavior, job stress, supervisory practices. |
|
|
Term
Hawthorne Effect |
|
Definition
knowledge of being in an experiment -> increase in performance. *Lighting exp.: lighting had no effect; performance improved across repeated trials. *Social factors *O Side |
|
|
Term
SIOP |
|
Definition
Society for Industrial and Organizational Psychology. |
|
|
Term
Independent Variable (IV) |
|
Definition
manipulated by the researcher, e.g., stressors. |
|
|
Term
Dependent Variable (DV) |
|
Definition
assessed in response to the IVs, e.g., blood pressure. *must be a behavior |
|
|
Term
Generalizability |
|
Definition
conclusions of a study can be extended to other groups of people, orgs, settings or situations. *Lab settings often too dissimilar to field settings. |
|
|
Term
Control |
|
Definition
rule out alternative explanations for results. |
|
|
Term
|
Definition
anything a researcher cannot control or account for. |
|
|
Term
True Experiment |
|
Definition
one or more IVs and one or more DVs; random assignment. *Adv: precise control, can identify cause and effect. *Disadv: ethical limitations, artificiality, confounders, bias. |
|
|
Term
Quasi-Experiment |
|
Definition
one or more features of a true exp have been compromised. |
|
|
Term
Field Experiment |
|
Definition
Adv: manipulate IVs in a natural setting, realistic, more generalizable, real workers and real setting, can suggest causality. Disadv: less control over confounders, resistance by employees, can't ethically manipulate things. |
|
|
Term
Classical Measurement Theory |
|
Definition
every observation of a variable can be divided into two components: true score (variable of interest) and error (random influences). *observation = true score + error |
|
|
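In symbols, the classical measurement model and the reliability ratio that follows from it (the ratio is the standard classical-test-theory result, added here for reference; error is assumed random and uncorrelated with the true score):

\[
X = T + E, \qquad \sigma^2_X = \sigma^2_T + \sigma^2_E, \qquad r_{XX} = \frac{\sigma^2_T}{\sigma^2_X}
\]

where X is the observed score, T the true score, and E the error component.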
Term
Reliability |
|
Definition
the consistency of measurement across repeated observations of a variable on the same subject. |
|
|
Term
internal consistency reliability |
|
Definition
how well the multiple measures on the same subject relate to one another. |
|
|
Term
Validity |
|
Definition
inferences made about what an observed score measures or represents. * concerned with the measuring device. |
|
|
Term
Construct Validity |
|
Definition
interpretation of a measure's meaning. *confidence in our interpretation of what the measure represents. |
|
|
Term
Face Validity |
|
Definition
the measure appears to assess what it is supposed to. |
|
|
Term
Content Validity |
|
Definition
measure assesses entire variable. |
|
|
Term
criterion-related validity |
|
Definition
the measure relates to other measures it should relate to. |
|
|
Term
Job Analysis |
|
Definition
a method for describing jobs and/or the human attributes necessary to perform them. *3 elements: 1. systematic procedure. 2. job is broken into smaller units. 3. analysis results in written product. |
|
|
Term
Job-Oriented Job Analysis |
|
Definition
provides info about the nature of tasks done on the job. describe tasks or characteristics of tasks. |
|
|
Term
Person-Oriented Job Analysis |
|
Definition
provides a description of the characteristics (KSAOs) necessary for a person to successfully perform a particular job. |
|
|
Term
KSAOs |
|
Definition
Knowledge: what a person needs to know to do a job. Skill: what a person is able to do on the job. Ability: a person's capability to do a job or learn to do job tasks. Other personal characteristics: anything relevant to the job that is not already covered. |
|
|
Term
Uses of Job Analysis |
|
Definition
-career development -legal defense -performance appraisal -Recruitment and selection -Training -Vocational counseling -Job classification, description, design -Research |
|
|
Term
Sources of Job Analysis Information |
|
Definition
-job analysts -SMEs (subject matter experts): people with detailed knowledge about the content and requirements of a job, e.g., employees and supervisors -trained observers |
|
|
Term
approaches to collect info |
|
Definition
-perform the job -observe employees on the job -interview/survey SMEs p. 64 adv/disadv |
|
|
Term
JCI |
|
Definition
Job Components Inventory: match job requirements to worker characteristics. 5 components: use of tools and equip, perceptual and physical requirements, mathematics, communication, decision making and responsibility |
|
|
Term
FJA |
|
Definition
Functional Job Analysis: observations and interviews of SMEs to provide a description of a job. *DOT and O*NET |
|
|
Term
DOT |
|
Definition
Dictionary of Occupational Titles: produced by the U.S. Department of Labor. Contains job analysis info on 20,000+ jobs. |
|
|
Term
O*NET |
|
Definition
Occupational Information Network: computer-based source of job descriptions. Six domains: experience requirements, worker requirements, worker characteristics, occupation requirements, occupation-specific info, occupation characteristics. |
|
|
Term
PAQ |
|
Definition
Position Analysis Questionnaire: instrument used to analyze any job. |
|
|
Term
Task Inventory |
|
Definition
questionnaire that contains a list of specific tasks that might be done on a job that is being analyzed. *ex: C-JAM |
|
|
Term
C-JAM |
|
Definition
Combination Job Analysis Method: uses interviews and questionnaires to collect info on KSAOs and tasks. |
|
|
Term
Job Evaluation |
|
Definition
a family of quantitative techniques that are used to determine the salary levels of jobs. *point method most popular. |
|
|
Term
Compensable Factors |
|
Definition
characteristics that serve as the basis for evaluation (point method): consequences of error on the job, responsibility, education required, skill required. |
|
|
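A minimal Python sketch of how point-method scoring might work, using the compensable factors named above; the weights and the 1-5 ratings are hypothetical illustrations, not values from the text:

# Each compensable factor gets a weight; each job gets a rating on each factor.
# Total points place the job in a pay grade (higher points -> higher salary level).
factor_weights = {
    "consequences of error": 3.0,
    "responsibility": 2.5,
    "education required": 2.0,
    "skill required": 2.5,
}
job_ratings = {  # hypothetical ratings for one job, on a 1-5 scale
    "consequences of error": 4,
    "responsibility": 3,
    "education required": 5,
    "skill required": 4,
}
total_points = sum(w * job_ratings[f] for f, w in factor_weights.items())
print(f"total evaluation points: {total_points}")  # 39.5 for these numbers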
Term
Comparable Worth |
|
Definition
different, but comparable, jobs should be paid the same. |
|
|
Term
purpose of performance appraisal |
|
Definition
*Administrative decisions, e.g., punishments (termination) and rewards (promotion); union and legal issues. *employee development and feedback *criteria for research |
|
|
Term
Criterion |
|
Definition
a standard by which you can judge the performance of anything. |
|
|
Term
Theoretical Criterion |
|
Definition
definition of what is good performance. aka theoretical construct. |
|
|
Term
Actual Criterion |
|
Definition
the way in which the theoretical criterion is assessed or operationalized. |
|
|
Term
Criterion Contamination |
|
Definition
actual measures something other than the theoretical. *common when people's judgments are used as the actual |
|
|
Term
Criterion Deficiency |
|
Definition
actual fails to capture theoretical. actual is an incomplete representation of what we are trying to assess. |
|
|
Term
Criterion Relevance |
|
Definition
extent to which the actual measures the theoretical. |
|
|
Term
Criterion Complexity |
|
Definition
multiple criterion measures are necessary to assess performance adequately. quality vs quantity |
|
|
Term
composite criterion approach |
|
Definition
combining individual criteria into a single score. *disadv: some things are more important than others. |
|
|
Term
multidimensional approach |
|
Definition
does not combine individual criterion measures. *disadv. not as easy to compare employees, but good for individual feedback |
|
|
Term
Dynamic Criteria |
|
Definition
variability of performance over time. |
|
|
Term
Organizational Citizenship Behavior (OCB) |
|
Definition
performing extra, voluntary tasks to help co-workers or the organization. |
|
|
Term
Objective Performance Measures |
|
Definition
counts of behaviors (# days absent from work) or outcomes of behaviors (total monthly sales). *Adv: consistent standards, not biased, easily quantified, face validity. *Disadv: not always applicable, performance not always under individual's control, too simplistic, performance unreliable. |
|
|
Term
Subjective Performance Measures |
|
Definition
most frequently used. people's judgments about performance. |
|
|
Term
Graphic Rating Scale |
|
Definition
most popular subjective measure. focuses on characteristics of the person or the person's performance. multipoint scale across several dimensions, e.g., a 1-5 scale from poor to outstanding. *p. 85 |
|
|
Term
Behavior-Focused Rating Forms |
|
Definition
focuses on specific instances of behavior that the person has done or could be expected to do. *Ex: "Can be counted on to be at work on time." |
|
|
Term
BARS |
|
Definition
Behaviorally Anchored Rating Scale: response choices are defined in behavioral terms. |
|
|
Term
MSS |
|
Definition
Mixed Standard Scale: provides the rater with several dimensions of behavior, each with behavior statements associated with it. The rater then indicates whether the subject is better than/as good as/worse than each statement. *Ex. p.92 |
|
|
Term
BOS |
|
Definition
Behavior Observation Scale: statements of critical incidents reflecting effective or ineffective behavior of an employee. Rater indicates amount of time the employee engaged in behavior (usually percentage). |
|
|
Term
Development of behavior-focused forms |
|
Definition
4 step process involving several people in an organization. *p.93 |
|
|
Term
models of the rating process |
|
Definition
there are several models that influence ratings of performance. Suggest that rating process involves several steps: observing performance, storing information about performance, retrieving info about performance from memory, translating retrieved info into ratings. |
|
|
Term
Schemas |
|
Definition
categories or frames of reference, e.g., stereotypes, prototypes |
|
|
Term
Halo Error |
|
Definition
a rater gives someone the same rating across all rating dimensions despite differences in performance across those dimensions. |
|
|
Term
Distributional Error |
|
Definition
rater tends to rate everyone the same. |
|
|
Term
Leniency Error |
|
Definition
rater rates everyone at the favorable end of the performance scale. |
|
|
Term
Severity Error |
|
Definition
rater rates everyone at the unfavorable end of the scale. |
|
|
Term
Central Tendency Error |
|
Definition
rater rates everyone in the middle of the scale. |
|
|
Term
Error-Resistant Forms |
|
Definition
reduce rating errors (halo, leniency, etc.). Behavior-focused forms (BARS, MSS) were originally created for this purpose; they encourage less idiosyncratic judgments. |
|
|
Term
Rater Error Training (RET) |
|
Definition
RET: familiarize raters with rater errors and teach them to avoid these rating patterns. *adv: reduces leniency and halo errors *disadv: less accurate ratings. |
|
|
Term
frame of reference training |
|
Definition
give raters examples of performance and the correct ratings. *adv: reduces errors *disadv: effect on accuracy unclear. |
|
|
Term
other factors that influence ratings |
|
Definition
-mood of rater -liking of subordinates -subordinate race/gender/etc. similarity (#1) -views of subordinate's motivation |
|
|
Term
360-Degree Feedback |
|
Definition
multiple raters to reduce bias. *adv: good for personal growth *disadv: not good to determine promotion |
|
|
Term
Descriptive Statistics |
|
Definition
provides ways for reducing large amounts of data to summary stats such as means and variances. |
|
|
Term
Measures of Dispersion |
|
Definition
measures of dispersion indicate the degree to which the observations differ from one another. |
|
|
Term
Variance |
|
Definition
a dispersion measure that is the arithmetic mean of the squared differences between each observation and the arithmetic mean of the same observations. |
|
|
Term
Standard Deviation |
|
Definition
square root of the variance. |
|
|
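A minimal Python sketch of the variance and standard deviation definitions above, using a small set of made-up scores (the numbers are hypothetical):

# Variance = mean of squared deviations from the mean; SD = its square root.
scores = [4, 8, 6, 5, 7]                                        # hypothetical observations
mean = sum(scores) / len(scores)                                # 6.0
variance = sum((x - mean) ** 2 for x in scores) / len(scores)   # 2.0
std_dev = variance ** 0.5                                       # about 1.41
print(mean, variance, std_dev)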
Term
Regression Equation |
|
Definition
equation that is used to predict one variable from another. *predictor and criterion variable. |
|
|
Term
Multiple Regression |
|
Definition
technique used to combine the predictive power of several variables to improve prediction of a criterion variable. *ex: high school grades and SAT to predict college grades. |
|
|
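A minimal NumPy sketch of simple and multiple regression, following the high school grades/SAT example in the card above; all data values are hypothetical:

import numpy as np

# Hypothetical data: predict college GPA (criterion) from HS GPA and SAT (predictors).
hs_gpa  = np.array([2.8, 3.2, 3.5, 3.9, 3.0, 3.7])
sat     = np.array([1050, 1150, 1220, 1380, 1100, 1300])
col_gpa = np.array([2.6, 3.0, 3.3, 3.8, 2.9, 3.5])

# Simple regression: one predictor.
slope, intercept = np.polyfit(hs_gpa, col_gpa, 1)
print(f"college GPA = {intercept:.2f} + {slope:.2f} * HS GPA")

# Multiple regression: combine both predictors by least squares.
X = np.column_stack([np.ones_like(hs_gpa), hs_gpa, sat])
(b0, b1, b2), *_ = np.linalg.lstsq(X, col_gpa, rcond=None)
print(f"college GPA = {b0:.2f} + {b1:.2f} * HS GPA + {b2:.4f} * SAT")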
Term
Inferential Statistics |
|
Definition
allow us to draw conclusions that generalize from the subjects we have studied to all the people of interest. |
|
|
Term
Error Variance |
|
Definition
the variability among subjects who receive the same treatment in an experiment. |
|
|
Term
Factorial Design |
|
Definition
two or more independent variables. |
|
|
Term
t-Test |
|
Definition
used to determine if two groups of subjects differ on a dependent variable. |
|
|
Term
ANOVA: Analysis of Variance |
|
Definition
used to determine if two or more groups of subjects differ on a dependent variable. |
|
|
Term
Factorial ANOVA |
|
Definition
used to determine the significance of effects of two or more independent variables on a dependent variable. |
|
|
Term
|
Definition
used to determine if the correlation between two variables is significantly greater than zero. |
|
|
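A minimal SciPy sketch of the tests in the preceding cards: a t-test (two groups), a one-way ANOVA (two or more groups), and a correlation with its significance test. The group names, variables, and scores are made up for illustration:

from scipy import stats

# Hypothetical job-satisfaction scores for employees in three training conditions.
group_a = [6, 7, 5, 8, 7]
group_b = [5, 4, 6, 5, 4]
group_c = [8, 7, 9, 8, 9]

t, p = stats.ttest_ind(group_a, group_b)          # do two groups differ?
print(f"t-test: t = {t:.2f}, p = {p:.3f}")

f, p = stats.f_oneway(group_a, group_b, group_c)  # do two or more groups differ?
print(f"ANOVA: F = {f:.2f}, p = {p:.3f}")

# Correlation between two hypothetical variables, with a (two-sided) p-value.
overtime_hours = [2, 5, 1, 8, 4]
stress_scores  = [3, 6, 2, 9, 5]
r, p = stats.pearsonr(overtime_hours, stress_scores)
print(f"correlation: r = {r:.2f}, p = {p:.3f}")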
Term
Meta-Analysis |
|
Definition
quantitative way of combining results of studies. |
|
|
Term
Mediator Variable |
|
Definition
explains why two variables relate to one another. |
|
|
Term
Moderator Variable |
|
Definition
a variable that affects the relationship between two other variables |
|
|