A Violent History of Benevolence traces how normative histories of liberalism, progress, and social work enact and obscure systemic violences. Chris Chapman and A.J. Withers explore how normative social work history is structured in such a way that contemporary social workers can know many details about social work's violences without ever imagining that they may also be complicit in them. These framings of social work history, the authors argue, actively create present-day political and ethical irresponsibility, even among those who imagine themselves to be anti-oppressive, liberal, or radical.
The authors document many histories usually left out of social work discourse, including communities of Black social workers (who, among other things, never removed children from their homes involuntarily), the role of early social workers in advancing eugenics and mass confinement, and the resonant emergence of colonial education, psychiatry, and the penitentiary in the same decade. Ultimately, A Violent History of Benevolence invites contemporary social workers and others to reflect on the complex nature of contemporary social work, and specifically on the present-day structural violences that social work enacts in the name of benevolence.
This book is your definitive guide to the rapidly growing role of Quantitative User Experience (Quant UX) research in product development. It provides an overview of the skills you need on the job, presents hands-on projects with reusable code, and shares advice on starting and developing a career. Going beyond basic skills, the book focuses on what is unique to Quant UX. The authors are two of the most widely recognized practitioners in Quant UX research, and they share insights from their combined decades of experience.
Organizations today have more data about user needs and behaviors than ever before. With this large-scale data, Quant UX researchers work to understand usage patterns, measure the impact of design changes, and inform strategic decisions. In the Quant UX role, interdisciplinary researchers apply analytical skills to uncover user needs, inform engineering and design, answer strategic business questions, and optimize software and hardware products for human interaction. This book provides guidance on customer satisfaction surveys, on understanding user behavior through log analysis, and on the statistical methods commonly used to assess user outcomes.
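As a purely illustrative aside (not code from the book), one common satisfaction metric of the kind the blurb alludes to is a top-two-box customer satisfaction (CSAT) score, which takes only a few lines of Python; the 1-to-5 response scale and function name below are assumptions for the example.

    # Illustrative sketch, not from the book: top-two-box CSAT on a 1-5 scale.
    def top_two_box_csat(responses):
        """Share of respondents answering 4 or 5 on a 1-5 satisfaction item."""
        if not responses:
            raise ValueError("no responses to score")
        return sum(1 for r in responses if r >= 4) / len(responses)

    # Six of eight respondents chose 4 or 5, so the score is 0.75.
    print(top_two_box_csat([5, 4, 3, 5, 2, 4, 5, 4]))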
Chris initially developed the systematic simplicity approach explored in this book while working as a consultant with BP International for eight years in the 1970s and 1980s on offshore North Sea oil projects. When the BP board approved the first project applying this approach, it mandated its use worldwide for all large or sensitive projects. The BP objectives included achieving 'risk efficiency' (a minimum level of risk for any given level of expected reward) in a 'clarity efficient' manner (a maximum level of relevant clarity for any given level of effort/cost), plus the delivery of projects on time and within budget. These objectives were realised for the decade the approach was employed, before BP shifted more risk to contractors as part of a portfolio of other interrelated corporate changes. IBM UK then used Chris in a central role for a 1990s culture change programme addressing what is now seen as 'opportunity management', adapting a version of the BP approach to enable all IBM staff to avoid risk of the wrong kind but take more risk of the right kind, to understand the difference between the two, and to distinguish good luck from good management and bad luck from bad management.
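To make 'risk efficiency' concrete, here is a rough formalisation (an illustration only; the notation P for the feasible set of plans, R for the risk measure, and E for expected reward is assumed, not taken from the book). A plan p* is risk efficient when no feasible alternative offers lower risk without sacrificing expected reward:

    \nexists \, p \in P :\; R(p) < R(p^{*}) \;\text{ and }\; E(p) \ge E(p^{*})

equivalently, R(p^{*}) = \min \{\, R(p) : p \in P,\; E(p) \ge E(p^{*}) \,\}. 'Clarity efficiency' applies the same dominance test to relevant clarity versus effort/cost.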
Part 1 of this book explores the basic deliverables of the systematic simplicity approach used by IBM, its BP foundations, and its subsequent employment by many other adopters. Parts 2 and 3 address further aspects of project, operations and corporate management, including strategy formation, safety, and the processes underlying all systematic simplicity approaches. These parts draw on further examples from extensive consultancy engagements with Ontario Hydro, National Power, UK Nirex, Railtrack and the UK MoD.
Routledge published the book 'Enlightened Planning' by Chris Chapman in 2019. It generalises the systematic simplicity approach, and the associated critiques of common practice, developed in the 2011 Wiley book 'How to Manage Project Opportunity and Risk' by Chris Chapman and Stephen Ward, itself the extensively revised and retitled third edition of their 1997 bestseller 'Project Risk Management'. These books received strong endorsements from a wide range of international experts, but they provide a level of detail some readers of this book may not need.
This book has been written for a very wide audience, to provide a concise but comprehensive introduction to the systematic simplicity concepts and operational tools covered by 'Enlightened Planning' and the earlier literature it builds upon. It is about how systematic simplicity can deliver what all 'best practice' ought to deliver.
The 2nd edition of R for Marketing Research and Analytics continues to be the best place to learn R for marketing research. This book is a complete introduction to the power of R for marketing research practitioners. The text describes statistical models from a conceptual point of view with a minimal amount of mathematics, presuming only an introductory knowledge of statistics. Hands-on chapters accelerate the learning curve by asking readers to interact with R from the beginning. Core topics include the R language, basic statistics, linear modeling, and data visualization, which is presented throughout as an integral part of analysis.
Later chapters cover more advanced topics yet are intended to be approachable for all analysts. These sections examine logistic regression, customer segmentation, hierarchical linear modeling, market basket analysis, structural equation modeling, and conjoint analysis in R. The text uniquely presents Bayesian models with a minimally complex approach, demonstrating and explaining Bayesian methods alongside traditional analyses for analysis of variance, linear models, and metric and choice-based conjoint analysis.
With its emphasis on data visualization, model assessment, and development of statistical intuition, this book provides guidance for any analyst looking to develop or improve skills in R for marketing applications.
The 2nd edition increases the book's utility for students and instructors with the inclusion of exercises and classroom slides. At the same time, it retains all of the features that make it a vital resource for practitioners: non-mathematical exposition, examples modeled on real-world marketing problems, intuitive guidance on research methods, and immediately applicable code.
Network Performance Security: Testing and Analyzing Using Open Source and Low-Cost Tools gives mid-level IT engineers the practical tips and tricks they need to use the best open source or low-cost tools available to harden their IT infrastructure. The book details how to use these tools and how to interpret their results. It begins with an overview of best practices for testing security and performance across devices and the network. It then shows how to document assets, such as servers, switches, hypervisor hosts, routers, and firewalls, using publicly available tools for network inventory.
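As a minimal sketch of the network-inventory idea (an illustration only, not one of the open source tools the book covers), the following Python snippet probes a small subnet for hosts that accept TCP connections on a few common service ports; the subnet and port list are hypothetical examples.

    # Minimal inventory sweep using only the Python standard library.
    # Illustrative sketch; a real inventory would use dedicated tools.
    import socket
    from ipaddress import ip_network

    PORTS = [22, 80, 443]  # SSH, HTTP, HTTPS

    def probe(host, port, timeout=0.5):
        # True if a TCP connection to host:port succeeds within the timeout.
        try:
            with socket.create_connection((str(host), port), timeout=timeout):
                return True
        except OSError:
            return False

    for host in ip_network('192.168.1.0/28').hosts():
        open_ports = [p for p in PORTS if probe(host, p)]
        if open_ports:
            print(f'{host}: responds on TCP ports {open_ports}')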
The book explores security zoning of the network, with an emphasis on isolated entry points for various classes of access. It shows how to use open source tools to test network configurations against malware, DDoS, botnet, rootkit, and worm attacks, and concludes with tactics for preparing and executing a remediation schedule covering the who, what, where, when, and how of responding when an attack hits.
Network security is a requirement for any modern IT infrastructure. Network Performance Security: Testing and Analyzing Using Open Source and Low-Cost Tools makes the network stronger through a layered approach of practical advice and good testing practices.