Kristian Norling, Intranätverk 2013, 23 May, Gothenburg, Sweden OPTIMISING YOUR CONTENT FOR FINDABILITY
Transcript
  1. Kristian Norling, Intranätverk 2013, 23 May, Gothenburg, Sweden: OPTIMISING YOUR CONTENT FOR FINDABILITY
  2. #intranatverk @kristiannorling @intranatverk
  3. Introduction: Who is here? Your expectations? Who is Kristian? 3 hours, one 20-minute break at 14.30. Lifetime answer guarantee on this class.
  4. THE ENTERPRISE SEARCH AND FINDABILITY SURVEY/REPORT: SIGN UP & DOWNLOAD THE 2012 REPORT
  5. Description: As the amount of content continues to increase, new approaches are required to provide good user experiences. Findability has been introduced as a new term among content strategists and information architects, and is most easily explained as: a state where all information is findable, and an approach to reaching that state. Search technology is readily used to make information findable, but as many have realized, technology alone is not enough.
  6. Description: Search engine optimisation is one aspect of findability, and many of the principles from SEO work in an intranet or website search context. Getting findability to work well for your website or intranet is a difficult task that needs continuous work.
  7. Brief Outline: We will start with some very brief theory, then use real examples, and also talk about what the organisations that are most satisfied with their findability do. Topics: Enterprise Search Engines vs Web Search; Governance; Organisation; User involvement; Optimise Content for findability; Metadata.
  8. IS IT EASY TO FIND THE RIGHT INFORMATION WITHIN YOUR ORGANISATION? (Source: The Enterprise Search and Findability Report 2012)
  9. EUROPE: 77% MODERATELY/VERY HARD
  10. WHAT ARE THE OBSTACLES TO FINDING THE RIGHT INFORMATION?
  11. EUROPE: 64.2% poor search functionality; 47.7% lack of adequate tags; 48.6% inconsistency in how we tag content; 47.7% don't know where to look
  12. DATE: THE SILVER BULLET OF ENTERPRISE SEARCH (Source: IntranetFocus)
  13. ENTERPRISE SEARCH: UN-COOL AND MISSION CRITICAL (Source: Julie Hunt)
  14. History of Search: In academia, search is called Information Retrieval. It is an old discipline, dating back thousands of years... Basic concepts in Information Retrieval: recall and precision, more later...
  15. Wikipedia Definition: Enterprise search is the practice of making content from multiple enterprise-type sources, such as databases and intranets, searchable to a defined audience. http://en.wikipedia.org/wiki/Enterprise_search
  16. The Concept of Enterprise Search: Precision. In the field of information retrieval, precision is the fraction of retrieved documents that are relevant to the search. Precision takes all retrieved documents into account, but it can also be evaluated at a given cut-off rank, considering only the topmost results returned by the system. This measure is called precision at n, or P@n. (Source: Wikipedia)
  17. The Concept of Enterprise Search: Recall. Recall in information retrieval is the fraction of the documents that are relevant to the query that are successfully retrieved. For example, for text search on a set of documents, recall is the number of correct results divided by the number of results that should have been returned. (Source: Wikipedia)
  18. Precision and Recall: M = number of relevant documents; N = number of retrieved documents; R = number of retrieved documents that are also relevant.
  19. Precision and Recall: Recall = R / M = number of retrieved documents that are also relevant / total number of relevant documents. Precision = R / N = number of retrieved documents that are also relevant / total number of retrieved documents.
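The definitions above translate directly into code. A minimal sketch in Python; the document identifiers in the example are invented for illustration:

```python
# Precision and recall for a single query, following the slide's definitions:
# N = retrieved documents, M = relevant documents, R = their intersection.

def precision_recall(retrieved, relevant):
    r = len(set(retrieved) & set(relevant))               # R
    precision = r / len(retrieved) if retrieved else 0.0  # R / N
    recall = r / len(relevant) if relevant else 0.0       # R / M
    return precision, recall

# Example: the engine returns 4 documents, 3 of which are among the 6 relevant.
p, r = precision_recall(["d1", "d2", "d3", "d4"],
                        ["d1", "d2", "d3", "d5", "d6", "d7"])
# p = 3/4 = 0.75, r = 3/6 = 0.5
```

Note the trade-off the two numbers capture: returning everything maximises recall but ruins precision, and vice versa.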
  20. Relevance: ...enterprises typically have to use other query-independent factors, such as a document's recency or popularity, along with query-dependent factors traditionally associated with information retrieval algorithms. Also, the rich functionality of enterprise search UIs, such as clustering and faceting, diminishes reliance on ranking as the means to direct the user's attention. (Source: Wikipedia)
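One way to picture blending query-dependent and query-independent factors is a weighted sum. This is a sketch under stated assumptions: the weights and signal shapes are invented for illustration, not taken from any particular search engine:

```python
# Blend a query-dependent text score with query-independent signals
# (recency and popularity). Weights are illustrative only.

def blended_score(text_score, age_days, views):
    recency = 1.0 / (1.0 + age_days / 365.0)   # newer documents score higher
    popularity = min(views / 1000.0, 1.0)      # capped popularity signal
    return 0.6 * text_score + 0.25 * recency + 0.15 * popularity

# A fresh, popular document with a perfect text match gets the maximum score:
blended_score(1.0, age_days=0, views=1000)  # → 1.0
```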
  21. PageRank
  22. Web/Consumer Data vs Enterprise Data: "Enterprise data simply isn't like web or consumer data: it's characterised by rarity and unconnectedness rather than popularity and context." (Charlie Hull, Flax Blog)
  23. Relevance: We do not have PageRank... but we have the benefit of social! Emails, people catalogues, connections, tagging, sharing etc. (CMSWire: Social Reconnects Enterprise Search)
  24. The Concept of Enterprise Search
  25. Organisation Resources! Work with all stakeholders = the whole organisation. Define processes, roles and routines to govern the solution. Help publishers get started by creating processes for better findability. Create easy-to-use administration interfaces. (IntranetFocus: Enterprise Search Team Management)
  26. Survey Results on Budget and Resources: The organisations that are very satisfied with their search have a (larger) budget, more resources, and systematically work with analysing search. As many as 45% of the respondents have no separate budget for search, but 20% have had a budget for 3 years or more. In the group with no budget, 56% are very or mostly dissatisfied with their current search. The dissatisfaction with search drops to 30% for those organisations with a dedicated budget for search.
  27. What Do the Organisations Do? In the group that is Very Satisfied (VS) with their current search, the number of Full Time Equivalents (FTE) is 1-2 or more. 67% of the VS and 71% of the mostly satisfied groups do search analytics. 50% in the very satisfied group do user testing regularly. 83% (VS) have a person or group that is responsible for analysing user behaviour and for making sure that search supports the business needs.
  28. Search Team: Search Manager; Search Technology Manager; Information Specialist; Search Analytics Manager; Search Support Manager. (By Martin White, IntranetFocus)
  29. Organisation: Not a project! Time and money are important. Measure: KPIs/search analytics. (CIO.com: How to Evaluate Enterprise Search; Findability Blog: Building a Business Case for Enterprise Search)
  30. CONTENT STRATEGY (@jcolman: How to Build SEO into Content Strategy)
  31. Governance: Information quality, with KPI. Metadata quality, with KPI. Information Lifecycle Management: time to live for different content types; archive, delete or keep? (SimCorp example). Search analytics on a regular basis.
  32. User Involvement: Get to know your users and their needs. Make sure your solution is easy to use. Perform continuous usability evaluations, like usage tests and expert evaluations. Make sure users find what they are looking for. Enable feedback loops for complaints, feedback and praise. Examples: Nordea, VGR and many more.
  33. Information: Good data/information hygiene. Crap in = crap out. Metadata is very important! (Presentation: Taxonomy and Metadata demystified; Video: TetraPak example; Video: VGR example)
  34. Information: Clean up and archive or delete outdated/irrelevant information. Ensure good quality of information by adding structured and suitable metadata. Information architecture and taxonomies (Earley & Associates: 10 Common Mistakes When Developing Taxonomies). Tagging (Presentation: Social Tagging, Folksonomies, Controlled Vocabularies).
  35. METADATA
  36. Listyeraze
  37. svenwerk
  38. DEWEY DECIMAL CLASSIFICATION
  39. KristianNorling
  40. Author: Douglas Coupland. Title: Generation A. Publisher: Windmill Books. Year: 2009. Printed by: CPI Cox & Wyman. First published: 2004.
  41. Semantic Metadata
  42. ESEO: Actionable activities. Metadata. Titles (example: Ernst & Young). Very important: content quality, Information Life Cycle Management.
  43. Ways to add metadata: Manually (editors); Automatic (software); Semi-automatic (software + editors); Tagging (users + software). (VGR Example: How to add metadata; Thomas Vander Wal: Integrating Folksonomies With Traditional Metadata)
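The semi-automatic option above amounts to software suggesting tags that an editor then confirms or rejects. A minimal sketch; the controlled vocabulary, function name, and matching rule are all invented for illustration:

```python
# Semi-automatic tagging: software suggests tags from a controlled vocabulary
# by simple term matching; an editor confirms or rejects the suggestions.

CONTROLLED_VOCABULARY = {"invoice", "contract", "policy", "travel", "expenses"}

def suggest_tags(text):
    # Normalise: split on whitespace, strip punctuation, lowercase.
    words = {w.strip(".,:;!?").lower() for w in text.split()}
    return sorted(CONTROLLED_VOCABULARY & words)

suggest_tags("Travel policy: how to file an invoice after a trip")
# → ['invoice', 'policy', 'travel']
```

A real implementation would use stemming and phrase matching, but the division of labour is the point: software does the tedious matching, editors keep the quality.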
  44. Search Analytics: Benefit of search analytics. What metrics are interesting? Actions to take based on search analytics. Dos and don'ts.
  45. SEARCH ANALYTICS GIVES USER INTENT
  46. Search Analytics: Important, delivers actionable to-dos quickly. 0-results. Top terms searched for. (Video: Search Analytics in Practice)
  47. Actions to take: Know what information is most wanted and work with that. Promote information when it is in demand. Are search queries seasonal? Find synonyms.
  48. Do: Fix 0-results. Check common terms. Cluster synonyms. Use Key Matches / Best Bets / Sponsored Links.
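The first two to-dos, fixing 0-result queries and checking common terms, fall out of a single pass over the search log. A sketch assuming a log of (query, result count) pairs; the log format and data are invented:

```python
from collections import Counter

# Find zero-result queries to fix and the most frequent queries to promote.
log = [
    ("travel policy", 12),
    ("travel policy", 9),
    ("expnse report", 0),    # typo: candidate for a synonym or spell fix
    ("expense report", 7),
    ("parental leave", 0),   # content gap or missing metadata
]

zero_results = sorted({q for q, n in log if n == 0})
top_queries = Counter(q for q, n in log).most_common(3)

# zero_results → ['expnse report', 'parental leave']
# top_queries[0] → ('travel policy', 2)
```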
  49. A FEW HOURS EVERY MONTH CAN DELIVER GREAT RESULTS!
  50. Do - bonus: Check user behaviour. Research in what context? Look at trending/temporal terms.
  51. Do not: Forget to work with your content. Forget metadata. Only use search analytics: combine it with web analytics.
  52. Fantastic book: Search Analytics for Your Site: Conversations with Your Customers, by Louis Rosenfeld (@louisrosenfeld).
  53. Summary: Involve the users (and stakeholders!). Allow user input (forms). Training for editors and publishers. Set up simple guidelines (E&Y). Lifecycle-manage information. Do search analytics. Measure and follow up.
  54. Bonus (SharePoint) tip 1: Create an information architecture, or at least a content model, answering the questions: what goes where, what information is related, and how should it be possible to access the information? Ensure that all information is mapped in this manner, and if new types of information arise that don't fit the model, revise and restructure (not refactor). Make sure that information architecture is not optional but mandatory.
  55. Bonus (SharePoint) tip 2: The way forward in a more complex information landscape is metadata and search. Use the term store to create taxonomies and metadata structures, add as much needed information as possible, and apply them to all the information through the content types in SharePoint. Applied term store information can be accessed directly via search as facets, which is a very powerful tool for quickly navigating to the correct information. The term store also gives you other possibilities to create...
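Facets built from metadata work by counting documents per metadata value so a result set can be narrowed step by step. A generic sketch of the idea, not SharePoint's actual API; the field names and documents are invented:

```python
from collections import defaultdict

# Count documents per value of a metadata field, as a facet panel would.
docs = [
    {"title": "Q1 report", "dept": "Finance", "type": "Report"},
    {"title": "Hiring policy", "dept": "HR", "type": "Policy"},
    {"title": "Q2 report", "dept": "Finance", "type": "Report"},
]

def facet_counts(docs, field):
    counts = defaultdict(int)
    for d in docs:
        counts[d[field]] += 1
    return dict(counts)

facet_counts(docs, "dept")  # → {'Finance': 2, 'HR': 1}
```

Clicking a facet value then simply filters the document list and recomputes the counts on the remaining documents.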
  56. Bonus (SharePoint) tip 3: Socialise your content and make sure that user input counts towards search relevance and the overall information architecture. User input can be explicit or implicit: explicit as likes or comments on the information, implicit via search logs. The explicit input is quite straightforward but might need a critical mass to become relevant, e.g. more likes = higher relevance. Implicit input via search logs needs more analysis but will give more leverage.